Brad Feld

Tag: software engineering

Now that our federal government is back at work and the short-term debt ceiling thing is resolved, it should be no surprise that the news cycle is now obsessed with Obamacare and its flawed implementation. Over the weekend I must have seen a dozen articles about this online and in the NY Times, and then I woke up this morning to a bunch of new pieces about the tech underlying the Healthcare.gov site, how screwed up it is, and what the Health and Human Services agency is going to do to fix it.

The punch line – a tech surge.

To ensure that we make swift progress, and that the consumer experience continues to improve, our team has called in additional help to solve some of the more complex technical issues we are encountering.

Our team is bringing in some of the best and brightest from both inside and outside government to scrub in with the team and help improve HealthCare.gov.  We’re also putting in place tools and processes to aggressively monitor and identify parts of HealthCare.gov where individuals are encountering errors or having difficulty using the site, so we can prioritize and fix them.  We are also defining new test processes to prevent new issues from cropping up as we improve the overall service and deploying fixes to the site during off-peak hours on a regular basis.

From my perspective, this is exactly the wrong thing to do. Many years ago I read Frederick Brooks’ iconic book on software engineering, The Mythical Man-Month. One of his key messages is that adding software engineers to an already late project will just delay things more. I like to take a different approach: if a project is late, take people off the project, shrink the scope, and ship it faster.

I think rather than a tech surge, we should have a “tech retreat and reset.” There are four easy steps.

1. Shut down everything, including taking all the existing sites offline.
2. Set a new launch date of July 14, 2014.
3. Fire all of the contractors.
4. Hire Harper Reed as CTO of Healthcare.gov, give him the ball and 100% of the budget, and let him run with it.

If Harper isn’t available, ask him for three names of people he’d put in charge of this. But put one person – a CTO – in charge. And let them hire a team – using all the budget for individual hires, not government contractors or consulting firms.

Hopefully the government owns all the software, even though Healthcare.gov apparently violates some open source licenses. Assuming it does, the new CTO and his team can quickly triage what is useful and what isn’t. By taking the whole thing offline for nine months, you aren’t in the hell of trying to fix something while it’s completely broken. It’s still a fire drill, but you are no longer inside the building that is burning to the ground.

It’s 2013. We know a lot more about building complex software than we did in 1980. So we should stop using approaches from the 1980s, admit failure when it happens, and hit reset. Doing a “tech surge” will only end in more tears.


A post in the New York Times this morning asserted that “Software Progress Beats Moore’s Law.” It’s a short post, but the money quote is from Ed Lazowska at the University of Washington:

The rate of change in hardware captured by Moore’s Law, experts agree, is an extraordinary achievement. “But the ingenuity that computer scientists have put into algorithms have yielded performance improvements that make even the exponential gains of Moore’s Law look trivial,” said Edward Lazowska, a professor at the University of Washington.

The rapid pace of software progress, Mr. Lazowska added, is harder to measure in algorithms performing nonnumerical tasks. But he points to the progress of recent years in artificial intelligence fields like language understanding, speech recognition and computer vision as evidence that the story of the algorithm’s ascent holds true well beyond more easily quantified benchmark tests.

If you agree with this, the implications are profound. Watching Watson kick Ken Jennings’ ass in Jeopardy a few weeks ago definitely felt like a win for software, but someone (I can’t remember who) had the fun line that “it still took a data center to beat Ken Jennings.”

That doesn’t really matter, since Moore’s Law will continue to apply to the data center, but my hypothesis is that there’s a much faster rate of advancement on the software layer. And if this is true, it has broad impacts for computing, and computing-enabled society, as a whole. It’s easy to forget about the software layer, but as an investor I live in it. As a result of several of our themes, namely HCI and Glue, we see firsthand the dramatic pace at which software can improve.

I’ve been through my share of 100x to 1000x performance improvements that came from a couple of lines of code or a change in the database structure in my life as a programmer 20+ years ago. At the time the hardware infrastructure was still the ultimate constraint: you could get linear progress by throwing more hardware at the problem. The initial software gains happened quickly, but then you were stuck waiting on the hardware improvements. If you don’t believe me, go buy a 286 PC and a 386 PC on eBay, load up dBase III on each, and reindex some large database files. Now do the same with FoxPro on each. The numbers will startle you.
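To make that kind of jump concrete, here’s a minimal sketch in Python (a hypothetical illustration, not code from any system mentioned above): the same lookup workload run first as a repeated linear scan, then with a simple in-memory index. The index is the “couple of lines of code” class of change, and on commodity hardware the speedup is routinely in the hundreds.

```python
import time

# Hypothetical illustration: the same lookup workload, first as a repeated
# linear scan, then with a simple in-memory index (a dict keyed by id).

records = [(i, f"user-{i}") for i in range(1_000_000)]
keys = range(0, 1_000_000, 10_000)  # 100 lookups

# Naive: O(n) scan per lookup, O(n * k) overall.
start = time.perf_counter()
for key in keys:
    name = next(n for (k, n) in records if k == key)
naive = time.perf_counter() - start

# Indexed: build a dict once, then O(1) per lookup.
start = time.perf_counter()
index = {k: n for (k, n) in records}
for key in keys:
    name = index[key]
indexed = time.perf_counter() - start

print(f"naive: {naive:.2f}s, indexed: {indexed:.2f}s, "
      f"speedup: {naive / indexed:.0f}x")
```

Nothing about the hardware changes between the two runs; the entire gain comes from the software layer.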

It feels very different today. The hardware is rapidly becoming an abstraction in a lot of cases. The web services dynamic – where we access things through a browser – built a UI layer in front of the hardware infrastructure. Our friend the cloud is making this an even more dramatic separation as hardware resources become elastic, dynamic, and much easier for the software layer folks to deploy and use. And, as a result, there’s a different type of activity on the software layer.

I don’t have a good answer as to whether it’s core algorithms, distributed processing across commodity hardware (instead of dedicated Connection Machines), new structural approaches (e.g. NoSQL), or just the compounding of years of computer science and software engineering. But I think we are at the cusp of a profound shift in overall system performance, and this article pokes us nicely in the eye to make sure we are aware of it.

The robots are coming. And they will be really smart. And fast. Let’s hope they want to be our friends.