A post in the New York Times this morning asserted that Software Progress Beats Moore’s Law. It’s a short post, but the money quote is from Ed Lazowska at the University of Washington:
“The rate of change in hardware captured by Moore’s Law, experts agree, is an extraordinary achievement. “But the ingenuity that computer scientists have put into algorithms have yielded performance improvements that make even the exponential gains of Moore’s Law look trivial,” said Edward Lazowska, a professor at the University of Washington.
The rapid pace of software progress, Mr. Lazowska added, is harder to measure in algorithms performing nonnumerical tasks. But he points to the progress of recent years in artificial intelligence fields like language understanding, speech recognition and computer vision as evidence that the story of the algorithm’s ascent holds true well beyond more easily quantified benchmark tests.”
If you agree with this, the implications are profound. Watching Watson kick Ken Jennings' ass in Jeopardy a few weeks ago definitely felt like a win for software, but someone (I can't remember who) had the fun line that "it still took a data center to beat Ken Jennings."
That doesn't really matter, though, because Moore's Law will continue to apply to the data center; my hypothesis is that there's a much faster rate of advancement at the software layer. If this is true, it has broad implications for computing, and for computing-enabled society, as a whole. It's easy to forget about the software layer, but as an investor I live in it. Through several of our themes, namely HCI and Glue, we see firsthand the dramatic pace at which software can improve.
In my life as a programmer 20+ years ago, I went through my share of 100x to 1000x performance improvements that came from a couple of lines of code or a change in the database structure. At the time the hardware infrastructure was still the ultimate constraint – you could only get linear progress by throwing more hardware at the problem. The initial software gains happened quickly, but then you were stuck waiting on hardware improvements. If you don't believe me, go buy a 286 PC and a 386 PC on eBay, load up dBase III on each, and reindex some large database files. Now do the same with FoxPro on each. The numbers will startle you.
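You don't need dBase to see the shape of that win. Here's a rough, modern sketch in Python (a made-up in-memory table, with a plain dictionary standing in for a database index) of what a couple of lines of code can do to lookup times:

```python
import random
import time

# A toy "table" of 100,000 records and 500 keys we want to look up.
records = [{"id": i, "name": f"customer-{i}"} for i in range(100_000)]
keys = random.sample(range(100_000), 500)

# Without an index, every lookup is a full scan of the table.
start = time.perf_counter()
for key in keys:
    match = next(r for r in records if r["id"] == key)
scan_time = time.perf_counter() - start

# The "couple of lines of code": build a hash index once, then probe it.
index = {r["id"]: r for r in records}
start = time.perf_counter()
for key in keys:
    match = index[key]
index_time = time.perf_counter() - start

print(f"full scans: {scan_time:.3f}s  indexed lookups: {index_time:.6f}s")
```

The absolute numbers will vary by machine, but the gap between the two loops is the point: the hardware didn't change, the software did.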
It feels very different today. The hardware is rapidly becoming an abstraction in a lot of cases. The web services dynamic – where we access things through a browser – built a UI layer in front of the hardware infrastructure. Our friend the cloud is making this an even more dramatic separation as hardware resources become elastic, dynamic, and much easier for the software layer folks to deploy and use. And, as a result, there’s a different type of activity on the software layer.
I don't have a good answer as to whether it's core algorithms, distributed processing across commodity hardware (instead of dedicated Connection Machines), new structural approaches (e.g. NoSQL), or just the compounding of years of computer science and software engineering, but I think we are at the cusp of a profound shift in overall system performance, and this article pokes us nicely in the eye to make sure we are aware of it.
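I can't tell you which of those matters most, but the commodity-hardware piece is easy to sketch. Here's a toy version of the split, map, and reduce pattern those systems rely on, with Python's multiprocessing standing in for a rack of cheap boxes and the classic word-count example standing in for real work:

```python
from collections import Counter
from multiprocessing import Pool

def count_words(chunk):
    """The 'map' step: each worker counts words in its own slice of the data."""
    counts = Counter()
    for line in chunk:
        counts.update(line.split())
    return counts

def word_count(lines, workers=4):
    # Split the input into roughly equal chunks, one per worker.
    size = max(1, len(lines) // workers)
    chunks = [lines[i:i + size] for i in range(0, len(lines), size)]
    # Fan out to parallel workers, then fold (the 'reduce' step) the partial counts.
    with Pool(workers) as pool:
        partials = pool.map(count_words, chunks)
    total = Counter()
    for partial in partials:
        total.update(partial)
    return total

if __name__ == "__main__":
    lines = ["the robots are coming", "the robots are fast"] * 1000
    print(word_count(lines).most_common(3))
```

Swap the Pool for a cluster of cheap machines and the Counter merge for a real reduce step, and you have the basic shape of the distributed systems mentioned above.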
The robots are coming. And they will be really smart. And fast. Let’s hope they want to be our friends.