Brad Feld

At the Formlabs Digital Factory event in June, Carl Bass used the phrase Infinite Computing in his keynote. I’d heard it before, but I liked it in this context and it finally sparked a set of thoughts which felt worthy of a rant.

For 50 years, computer scientists have been talking about AI. However, in the past few years a subset of AI (or a superset, depending on your point of view) now called machine learning has accelerated remarkably and taken over as the hot new thing.

Since I started investing in 1994, I’ve been dealing with the annual cycle of the hot new thing. Suddenly, a phrase is everywhere, as everyone is talking about, labeling, and investing in it.

Here are a few from the 1990s: Internet, World Wide Web, Browser, Ecommerce (with both a capital E and a little e). Or, some from the 2000s: Web Services, SOAs, Web 2.0, User-Generated Data, Social Networking, SoLoMo, and the Cloud. More recently, we’ve enjoyed Apps, Big Data, Internet of Things, Smart Factory, Blockchain, Quantum Computing, and Everything on Demand.

Nerds like to label things, but we prefer TLAs (three-letter acronyms). And if you really want to see what next year's buzzwords are going to be, go to CES (or stay home and read the millions of web pages written about it).

AI (Artificial Intelligence) and ML (Machine Learning) particularly annoy me, in the same way Big Data does. In a decade, what we are currently calling Big Data will be Microscopic Data. I expect AI will still be around, as it is just too generally appealing to ever run its course as a phrase, but ML will have evolved into something that includes the word “sentient.”

In the meantime, I like the phrase Infinite Computing. It’s aspirational in a delightful way. It’s illogical, in an asymptotic way. Like Cloud Computing, it’s something a marketing team could get 100% behind. But, importantly, it describes a context that has the potential for significant changes in the way things work.

Since the year I was born (1965), we’ve been operating under Moore’s Law. While there are endless discussions about the constraints and limitations of Moore’s Law, most of the sci-fi that I read assumes an endless exponential growth curve associated with computing power, regardless of how you index it.
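
Just to make that exponential assumption concrete, here’s a back-of-the-envelope sketch (my numbers, not anyone else’s): assume a doubling roughly every two years since 1965 and see how far the compounding gets you.

```python
# Rough Moore's Law compounding since 1965.
# Assumes one doubling roughly every two years -- an illustrative
# simplification, not a precise claim about any particular chip.
START_YEAR = 1965
CURRENT_YEAR = 2017  # roughly when this was written
DOUBLING_PERIOD_YEARS = 2

doublings = (CURRENT_YEAR - START_YEAR) / DOUBLING_PERIOD_YEARS
growth_factor = 2 ** doublings

print(f"{doublings:.0f} doublings -> roughly {growth_factor:,.0f}x the computing power")
# 26 doublings -> roughly 67,108,864x the computing power
```

Quibble with the doubling period all you want; the point is that the curve compounds into numbers that are hard to reason about intuitively.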

In that context, ponder Infinite Computing. It’s not the same as saying “free computing,” as everything has a cost. Instead, it’s unconstrained.

What happens then?