How long before the number of developers necessary to build what we can dream about stops growing, starts shrinking, and goes to zero? Will coding as we know it always be around?
If anything has held true about the progress of technology over the last 150 years, it is that one generation's bread-and-butter tasks become automated, and the skill level required to participate successfully in the workforce gets forced up. We've seen machines disrupt all sorts of human labor, particularly in the area of "making stuff".
So will that hold true for ones and zeros?
We're spending a lot of time and money increasing the number of people who can write code, calling it an essential skill for the future--but will that last? We've certainly seen technology labor get disrupted by technology itself before. In the '90s, one of the largest areas of growth for skilled labor was in corporate IT infrastructure and support. People flocked to get Cisco certified for jobs that helped get offices physically set up and supported with the computers, routers, and servers needed to work. Now that skillset seems destined for obsolescence, as the network goes from physical to virtual.
After all, lots of the code required to build web services is now available off the open source shelf--and many of the tasks formerly asked of custom code are being worked into frameworks themselves, as with Ruby on Rails. The developer-months it takes to get a web service up and running have shrunk dramatically over the past decade. What used to take weeks or months can now be done at a hackathon. So why should it stop before it goes all the way to zero, and we see the vast majority of apps get built without anyone ever touching code?
There are a few "app builders" out there for mobile, enabling anyone to create a basic content-driven mobile application without touching code. What will the advancement of such platforms do to the skill requirements for developers going forward?
Will we ever see "Peak Software Developer"?
Despite the app-building apps, new development platforms like mobile reverse this trend. The iPhone meant that a ton more devs needed to learn all sorts of new code, without the benefit of prior frameworks and shortcuts. Perhaps there will always be some new hardware or platform to move to, but what if that normalizes to Android or iOS on every conceivable device? Will enough libraries and other modules get developed to enable development to move to a fully accessible, graphical interface--just like publishing tools for content? If I can use Squarespace today to build sites you couldn't even code up 15 years ago, will the same hold true for interactive applications?
I think you have to ask this of any industry if you're thinking about talent development for the next generation. It's not like people haven't tried replacing VCs with an algorithm. It's well worth asking why not--what factors make this more or less likely?
I ask this because, more and more, I get the feeling that coding is no longer the barrier to innovation--no more so than being able to pay for servers or install a CMS is the barrier to having a good blog. We seem pretty focused on making more developers, but what I find most lacking in the pitches I take isn't tech talent--it's insight, cleverness, and design sensibility. It's not the ability to make that is holding us back--it's what to make and for whom, and how to make it usable. Usually, once people figure out what to make, it's pretty makeable. That means the way people get ahead isn't speed or raw programming bandwidth--it's design and implementation.
It's pointless to teach everyone to build without teaching them how to decide what's worth building. Architects know this. They learn art history and study traffic flow as part of putting up a building--not just how beams connect to each other.
I don't think we'll ever lose the need to understand and produce raw code--just like I don't think we'll ever stop teaching kids how to spell despite the widespread availability of spell check. If code is the language in which innovation will be spoken in the future, we need to realize that just teaching people how to spell and use good grammar won't produce Shakespeare or the next great American novel. I'd love to see more emphasis on the application of those tools--problem solving, user interaction, business analysis.