She's a future geek, just like daddy
Technologist John C. Dvorak argued that computers in the '80s and '90s were more powerful than they are now, not because the chips carried higher MHz counts, but because of customizability. Back then, if you wanted to achieve something, you had basic tools to do it (e.g., MS Word), and if you needed something more advanced, you told the computer to do it (e.g., by writing a VBA script). Nowadays, if you want to do something, you look for an app in the app stores, and if you can't find one, you just don't do it. Instead of computers helping us work, the "walled gardens" determine what we can or cannot do. We are getting dumber because of technology, not smarter. Even worse, we are not only dumber but also shallower; technology is trending less toward solving problems that help humanity and more toward getting your app in front of millions of eyeballs so you can flip it to Facebook or Yahoo (the Instagram paradigm: "zero to a billion dollars in 18 months or less"). This is frustrating if you've been observing technology for as long as we have. It's up to our children, the next generation, to reverse this downward spiral.
At work, I've interviewed candidates for investment analyst and associate positions. I've talked to candidates with various bachelor's degrees, MBAs, and MScs; the good and the not-so-good come from all backgrounds. But I've always had a preference for engineers. More so than business majors, engineers tend to think logically and break problems down into smaller, solvable sub-tasks. That is an important problem-solving skill; it's what separates us from the animals.
Programming is a lot like that. I'm no professional coder, partly because of my (late) realization that real-life coding mostly involves dealing with legacy code, running unit tests, completing documentation, and doing maintenance work. I have tons of respect for people who do this for a living. But as a kid growing up, and all the way through college, coding was a skill that set you apart.
Perhaps the biggest downside of exposing kids to coding early, in my opinion, is the sheer barrier to entry. It takes a lot of code, and a lot of work, to create something you can be proud of (GUI designers help a little, but not that much), and I just hope they don't get frustrated by that barrier. Platforms also change quickly. Back in my day, I played a BASIC turn-based game with two gorillas throwing bananas at each other; I think it was written by Bill Gates himself! I would read through the (highly readable) code to see if I could change the color of the bananas, or the text, and so on. The internet didn't exist back then. Later on, I moved up to Java, then to C# .NET. I also did some coding in Mathematica for calculus and computational theory courses. That was more than a decade ago. Nowadays people seem to code only for iOS or Android, neither of which existed eight years ago. Who's to say that what you teach them now will still be useful when they reach college age?
I haven't made up my mind, but perhaps it's better to teach kids to set up a home network. I don't mean just hooking up a modem-router and setting a Wi-Fi password. I'm talking about building a Linux box as a server, configuring your firewall, and optimizing access point locations. Configuring dynamic DNS to enable remote access. Setting up network file sharing for streaming audio. Setting up a backup system. These are not easy jobs; professionals do them for good money. But they produce immediate, tangible results, which may put a smile on their young faces. It teaches problem solving and troubleshooting, with some scripting and coding sprinkled in for good measure. And these are all useful skills, whatever they may end up doing for their careers.
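To give a flavor of the "scripting sprinkled in" part, here is a minimal sketch of the backup step: a dated-snapshot script a kid could write and run on the home server. The paths here are demo stand-ins created in a temp directory; on a real box SRC would be the shared folder, DEST a second disk, the copy step would typically be `rsync -a --delete`, and the whole thing would run nightly from cron.

```shell
#!/bin/sh
# Sketch of a dated-snapshot backup job for a home file server.
# Demo only: SRC and DEST live in a temp dir, not real server paths.
set -e
BASE=$(mktemp -d)
SRC="$BASE/share"                       # data to protect (stand-in)
DEST="$BASE/backup/$(date +%Y-%m-%d)"   # one snapshot folder per day

mkdir -p "$SRC" "$DEST"
echo "family-photos" > "$SRC/photos.txt"    # stand-in for real files

cp -a "$SRC/." "$DEST/"                 # copy, preserving attributes
ls "$DEST"                              # prints: photos.txt
```

The point isn't the three lines of copying; it's that the kid sees a concrete result (today's dated folder appears, the files are safe) and then has a reason to learn the next layer: cron schedules, rsync flags, what `set -e` does when something fails.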
Youngsters are fickle, though; one day they want to learn Linux, the next day they may join a rock band. That's okay. In the end you can only guide them so far. They will pick the path they want to take, we can only nurture.
Happy Valentine's Day, everyone @}-->---