Brad Feld

Tag: machine learning

At the Formlabs Digital Factory event in June, Carl Bass used the phrase Infinite Computing in his keynote. I’d heard it before, but I liked it in this context and it finally sparked a set of thoughts which felt worthy of a rant.

For 50 years, computer scientists have been talking about AI. However, in the past few years, a subset of AI (or a superset, depending on your point of view) now called machine learning has accelerated remarkably and taken over as the hot new thing.

Since I started investing in 1994, I’ve been dealing with the annual cycle of the hot new thing. Suddenly, a phrase is everywhere, as everyone is talking about, labeling, and investing in it.

Here are a few from the 1990s: Internet, World Wide Web, Browser, Ecommerce (with both a capital E and a little e). Or, some from the 2000s: Web Services, SOAs, Web 2.0, User-Generated Data, Social Networking, SoLoMo, and the Cloud. More recently, we’ve enjoyed Apps, Big Data, Internet of Things, Smart Factory, Blockchain, Quantum Computing, and Everything on Demand.

Nerds like to label things, but we prefer TLAs. And if you really want to see what next year’s buzzwords are going to be, go to CES (or stay home and read the millions of web pages written about it).

AI (Artificial Intelligence) and ML (Machine Learning) particularly annoy me, in the same way Big Data does. In a decade, what we are currently calling Big Data will be Microscopic Data. I expect AI will still be around as it is just too generally appealing to ever run its course as a phrase, but ML will have evolved into something that includes the word “sentient.”

In the meantime, I like the phrase Infinite Computing. It’s aspirational in a delightful way. It’s illogical, in an asymptotic way. Like Cloud Computing, it’s something a marketing team could get 100% behind. But, importantly, it describes a context that has the potential for significant changes in the way things work.

Since the year I was born (1965), we’ve been operating under Moore’s Law. While there are endless discussions about the constraints and limitations of Moore’s Law, most of the sci-fi that I read assumes an endless exponential growth curve associated with computing power, regardless of how you index it.
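To put a rough number on it, here’s a back-of-the-envelope sketch. The two-year doubling period is an assumption (a loose reading of Moore’s Law; the “real” period gets argued about endlessly), and 2017 is just the year I’m writing this.

```python
# Back-of-the-envelope only: assume computing power doubles every two years
# (a loose reading of Moore's Law; the actual period is debated endlessly).
DOUBLING_PERIOD_YEARS = 2

def relative_power(start_year, end_year, doubling_period=DOUBLING_PERIOD_YEARS):
    """How much more computing power end_year has than start_year, under pure exponential growth."""
    doublings = (end_year - start_year) / doubling_period
    return 2 ** doublings

# 1965 (the year I was born) to 2017: 26 doublings, roughly a 67-million-fold increase
print(f"{relative_power(1965, 2017):,.0f}x")
```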

In that context, ponder Infinite Computing. It’s not the same as saying “free computing,” as everything has a cost. Instead, it’s unconstrained.

What happens then?


One of the consistent characteristics of the tech industry is an endless labeling of technology and approaches. Some of it is foundational, resulting from something entirely new. Much of it is re-categorizing something, either because it is suddenly trendy again or because a set of ideas has been organized in a new way. When I was in my 20s, I found this exciting. Now that I’m in my 50s and am used to it, I find it relaxing, as it makes me feel at home.

An example of this is artificial intelligence (or AI). If you teleported here from another planet yesterday, you’d think we just discovered this thing called AI and were creating bots to exercise it, while others were writing philosophical treatises to try to figure out how to prevent it from exterminating the human race. If the following names – John McCarthy, Marvin Minsky, Allen Newell, Arthur Samuel and Herbert Simon – don’t mean anything to you and you think you know something about AI, I encourage you to go buy a copy of The Society of Mind and to set your DMC-12 with a flux capacitor to 1956. If you still don’t know what I’m talking about, that’s cool – just ignore me.

Another example is big data, which became all the rage around 2012. I keynoted an Xconomy Conference on Big Data with the opening line “Big Data is Bullshit.” My real quotable comment was “Twenty years from now, the thing we call ‘big data’ will be tiny data. It’ll be microscopic data. The volume that we’re talking about today, in 20 years, is a speck.” Nonetheless, hundreds of big data companies were created and funded.

Within the past two years, the phrase machine learning has taken over as the label du jour. Any reader of science fiction knows that the phrase – and the activity – has been around for a long time. If you have a Tesla, you are probably telling all your friends about how it uses machine learning. There’s even a Stanford course on Coursera about Machine Learning. But what does it actually mean?

I ran into two awesome blog posts the other day titled Machine Learning is Fun! and Machine Learning is Fun! Part 2. Adam Geitgey, whom I don’t know, did a wonderful job of writing about this in an accessible way, evolving a set of examples (including Super Mario Brothers, from 1985) that goes very deep by way of demonstration.
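To make the label a little more concrete before you click through, here’s a toy sketch of my own (not from those posts) of what supervised machine learning boils down to: show the machine some examples, let it fit a function, then ask it about something it hasn’t seen. The numbers are made up and the model is the simplest one there is, plain linear regression via scikit-learn.

```python
# A toy: made-up data mapping house size (sq ft) to price. The point is the
# shape of the workflow: examples in, fitted function out, then predictions.
from sklearn.linear_model import LinearRegression

sizes = [[800], [1200], [1800], [2400], [3000]]         # training inputs
prices = [150_000, 210_000, 300_000, 390_000, 470_000]  # training outputs

model = LinearRegression()
model.fit(sizes, prices)        # "learning" = fitting parameters to the examples

print(model.predict([[2000]]))  # estimate for a 2,000 sq ft house it never saw
```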

If you’ve got other great introductory resources for Machine Learning, I encourage you to put links in the comments.