I have a long-documented love of APIs. Ever since I started programming in the late 1970s, writing software that interacted with other software has seemed like a cool idea to me. Abstractions 30 years ago weren’t very sophisticated, but when I look back at some of the Apple II documentation in my own personal computer museum, I am amazed at what you could peek and poke, even back then.
The API has been a long-time staple of established companies. It has morphed plenty over the years – including a long run as the SDK (software development kit), as popularized by Microsoft in the 1980s. The API in all its naked glory made a nice comeback in the 1990s and has since become firmly established as an integral part of the Internet. While occasional arguments about REST, SOAP, and XML-RPC flare up, most of the time we are happy with whatever API abstraction layer we get.
Many of our Internet-based portfolio companies – such as NewsGator, FeedBurner, and Technorati – have built APIs to their services. However, the API isn’t limited to consumer companies – we’ve had great success with our friend the API at enterprise software companies like Rally Software.
Recently Twitter reminded us just how powerful an API could be. Twitter’s well documented API resulted in an explosion of Twitter add-on applications which have been key to propelling its adoption. FriendFeed followed suit and launched an API shortly after its service was available. It’s no surprise that the founders of Twitter and FriendFeed have a Google heritage – nor is it a surprise that Google’s API machine continues to crank out a remarkable set of APIs for a wide variety of Google services.
Today there is no excuse for launching a consumer web service without an API. If you do that, to you I say "you suck". Ok – it’s not trivial to scale an API up, but why not design it in from the beginning? If you wake up one day and your service (or API) has suddenly become popular, you have options like Mashery that you can outsource your API to. According to Oren Michels, the CEO of Mashery, a base API package should include – in addition to the actual API code – the following (a rough sketch of the last two ideas appears after the list):
- Documentation
- Great examples of what people can do with it (I’d rather steal and modify your examples than start from scratch)
- Self-provisioning (I want to play with it at 3am – don’t make me wait until you are awake to get started)
- Path to success (If I’m using it a lot, don’t throttle me. Give me a way to pay you to use your API more)
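To make the last two points concrete, here is a minimal sketch of what self-provisioning plus a paid upgrade path might look like. Everything in it is hypothetical – the endpoint names, the tiers, the quotas, and the in-memory "database" are made up for illustration (it uses Flask purely as a convenient example framework). The point is simply that a developer can get a working key at 3am with no human in the loop, and can pay to raise their quota instead of being throttled.

```python
# Hypothetical sketch: self-provisioning API keys with tiered quotas.
import secrets
from flask import Flask, jsonify, request

app = Flask(__name__)

# Made-up requests-per-hour quotas for each plan
TIERS = {"free": 1_000, "paid": 100_000}

# In-memory key store; a real service would persist this
api_keys = {}

@app.route("/v1/keys", methods=["POST"])
def provision_key():
    """Issue an API key immediately -- no waiting for a human."""
    payload = request.get_json(silent=True) or {}
    email = payload.get("email")
    if not email:
        return jsonify(error="email is required"), 400
    key = secrets.token_urlsafe(32)
    api_keys[key] = {"email": email, "tier": "free", "quota": TIERS["free"]}
    return jsonify(key=key, tier="free", hourly_quota=TIERS["free"]), 201

@app.route("/v1/keys/<key>/upgrade", methods=["POST"])
def upgrade_key(key):
    """The 'path to success': pay to raise the quota instead of being throttled."""
    record = api_keys.get(key)
    if record is None:
        return jsonify(error="unknown key"), 404
    record["tier"] = "paid"
    record["quota"] = TIERS["paid"]
    return jsonify(key=key, tier="paid", hourly_quota=TIERS["paid"])

if __name__ == "__main__":
    app.run(port=5000)
```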
I believe we are at the beginning of another conceptual shift. The enterprise software world has been talking about SOAs while the parallel universe of the consumer Internet has been implementing web services and APIs galore. However, no one has really worked through broad API scale issues on an Internet-wide basis. Imagine the following scenario:
You create a new web site called "CoolNewSite". You create an API for CoolNewSite. You want to connect CoolNewSite (via the API) to the 531,177 other web sites that have APIs. Yes – you realize that only 1,753 of them actually matter, but you’d like to be able to interoperate with all of them, no matter how large or small. So – you get to work writing 531,177 connectors between your API and all the other APIs out there. 13 years into this process, CoolNewSite becomes popular and suddenly you are overwhelmed with traffic. Your solution – start throttling the number of calls that another service can send you in a given time period so that you don’t continually fall over.
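That throttling fallback usually ends up looking something like the sketch below – a token-bucket limiter that caps how many calls each caller can make per time window. The caller IDs, rates, and burst sizes here are invented for illustration; this is just to show how blunt an instrument per-caller throttling is.

```python
# Rough sketch of per-caller throttling with a token bucket (illustrative only).
import time

class TokenBucket:
    def __init__(self, rate_per_sec: float, burst: int):
        self.rate = rate_per_sec       # tokens refilled per second
        self.capacity = burst          # maximum burst size
        self.tokens = float(burst)
        self.last = time.monotonic()

    def allow(self) -> bool:
        """Return True if the caller may make one more call right now."""
        now = time.monotonic()
        # Refill based on elapsed time, capped at the bucket capacity
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

# One bucket per calling service, e.g. 5 calls/sec with bursts of 20
buckets = {}

def handle_call(caller_id: str) -> str:
    bucket = buckets.setdefault(caller_id, TokenBucket(rate_per_sec=5, burst=20))
    return "200 OK" if bucket.allow() else "429 Too Many Requests"
```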
Sound – er – familiar?
There are at least two interesting businesses that come out of this problem. Mashery is one; a company we have funded called Gnip is the second. I’ve got a third one in me, but I’m going to think about it a little more and see if it’s really a business or just a feature of Gnip or Mashery.
In the meantime, this is one case where sucking less doesn’t work. Get going on your API.