Dec 05, 2016 20:00
For a long time now I’ve looked down on the younger generation of programmers, mainly because they use frameworks and libraries willy-nilly, without understanding how they work or what exactly they do, and call it "programming", or worse yet, "hacking".
But this year I’ve been realizing that I’m the old geezer on the porch complaining that his generation was somehow different when it was not.
Sure I learned about programming by entering machine language into a console, and went up from there. But I didn’t know jack shit about circuit design, and I still don’t know jack about it. In the past I’ve claimed this was different because circuit design was hardware design, and as a software person I was in a wholly different field, and justified in ignoring what lay beneath it.
But that division only appeared in retrospect, after the messy innovation that spawned the first solid platforms had taken place.
Looking around now, what divisions are starting to take shape? What core fields of study are being placed firmly on the wrong side of those divisions, doomed to fade away into dark corners of the industry?
Here's a list off the top of my head:
* Tomorrow's programmers are going to stop worrying almost entirely about WHERE their code is actually being run. And it will be hard to find out in any case.
* Tomorrow's programmers are going to expect software to auto-optimize itself to a huge degree, by having an AI interactively refine their design. The very notion of optimizing something for a given platform will seem quaint.
* Tomorrow's programmers are going to rent all their development tools on a monthly basis. They will be auto-updated every 24 hours. Every keystroke they make while on the clock will be recorded, and much of it will be rewindable and branch-able like a git repository on steroids. Development in an offline state will be severely handicapped, perhaps even impossible, but it won't matter because everything will be online all the time, for almost zero energy cost.
* Tomorrow's programmers are going to expect to be able to walk up to anyone's device anywhere, and with permission, authenticate to it with a fingerprint or iris scan or code key, and instantly start using their own personal development environment, picking up exactly where they left off. When they stand up and move more than 3 feet away from the machine, it will sense this and auto-lock, and the programmer can move on to another machine. (This is almost the way it is already, for some online developers working exclusively in browsers.)
What changes do you foresee, that will render large parts of current knowledge, or process, useless or irrelevant?