“We cannot afford the advertising business model. The price of free is actually too high. It is literally destroying our society, because it incentivizes automated systems that have these inherent flaws. Cambridge Analytica is the easiest way of explaining why that’s true. Because that wasn’t an abuse by a bad actor — that was the inherent platform. The problem with Facebook is Facebook.”
The article ends with a parallel quote from Tim Berners-Lee, creator of the World Wide Web:
“The web that many connected to years ago is not what new users will find today. The fact that power is concentrated among so few companies has made it possible to weaponize the web at scale.”
I just read the article and all of the attached long-form interviews. I think my favorite, only because it’s so provocative, is the one with Roger McNamee titled ‘You Have a Persuasion Engine Unlike Any Created in History.’
There are a few mentions of Zynga (which we were investors in) in the various articles, which caused me to reflect even more on the 2007 – 2010 time period when free-to-consumer (supported by advertising) was suddenly conflated with freemium (or free trials for enterprise software). The latter (freemium) became a foundational part of the B2B SaaS business model, while the former became an extremely complex dance between digital advertising and user data.
Tristan’s quote “the price of free is actually too high” is important to consider. What is going on here (“free services”) is nothing new. The entire television industry was created on it (broadcast TV was free, supported by advertising, dating back well before I was born.) Nielsen ratings started for radio in the 1940s and TV in the 1950s. The idea of advertisers targeting users of free services based on data is, well, not new.
Propaganda is not new either. The etymology of the word from Wikipedia is entertaining in its own right.
“Propaganda is a modern Latin word, the gerundive form of propagare, meaning to spread or to propagate, thus propaganda means that which is to be propagated. Originally this word derived from a new administrative body of the Catholic church (congregation) created in 1622, called the Congregatio de Propaganda Fide (Congregation for Propagating the Faith), or informally simply Propaganda. Its activity was aimed at “propagating” the Catholic faith in non-Catholic countries. From the 1790s, the term began being used also to refer to propaganda in secular activities. The term began taking a pejorative or negative connotation in the mid-19th century, when it was used in the political sphere.”
So what? Why the fuss? A cynic would say something like “this is not what the hippy-techies of the 60s wanted.” True, that. But the arc of human society is littered with outcomes that diverge wildly from the intended actions. Just watch Game of Thrones or Homeland to get a feeling for that, unless you struggle with conflating fact and fiction, which seems to trouble fewer people every day based on the information we consume and regurgitate.
I think something more profound is going on here. We are getting a first taste of how difficult it is to live in a world in which humans and computers are intrinsically linked. Tristan’s punch line “The problem with Facebook is Facebook” hints at this. Is the problem the leadership of Facebook, the people of Facebook, the users of Facebook, the software of Facebook, the algorithms of Facebook, what people do with the data from Facebook, or something else? Just try to pull those apart and make sense of it.
I think this is a pivotal moment for humans. I’ve heard the cliche “the genie can’t be put back in the bottle” numerous times over the past few weeks. Any reader of Will and Ariel Durant knows that the big transitions are hard to see when you are in them but easy to see with the benefit of decades of hindsight. This might be that moment of transition, where there is no going back to what was before.
We (the tech industry) like to label everything. I attribute the source of this desire and need to Regis McKenna although he may have just been the genius that amplified it.
The labels I dealt with early in my professional career (the 1980s) included microcomputers, minicomputers, artificial intelligence, expert systems, neural networks, middleware, supercomputers, parallel computing, and killer app. Oh – and groovy. And music by Boston, Journey, Rush, Pink Floyd, and AC/DC.
When we invested in Fitbit in 2010, the phrase we used to describe the product was human instrumentation. If you read the original post, you’ll be amused by the lack of marketing language for what, in a few years, would evolve through labels like quantified self and wearables. And yes, I still call it human instrumentation (as a subset of human computer interaction), since that’s the part that is interesting to me.
BodyHacking and BioHacking are trendy labels for this. They’ve long been a favorite trope of the sci-fi that I enjoy and are now regularly showing up in sci-fi movies. One of the annual conferences, BDYHAX, even has a description that fits with the notion of transhumanism.
BDYHAX is a 3-day celebration of human enhancement, transhumanism, and biohacking. With a special focus on DIY healthcare and other body hacks, BDYHAX brings together industry experts, curious newcomers, and everyone in between.
Mom / Dad – do these words skeeve you out? I’m betting they do. Or, at the minimum, you feel detached from them. It is in this way that I think the tech industry, with its labels, is doing humanity a great injustice on this topic.
Here are some common bodyhacks that we’ve been doing for a long time.
You get the idea.
I think part of the problem might be gender. Go read the following post by Kate Preston McAndrew titled Vagina, vagina, vagina (the subtitle is “Redesigning the pelvic exam experience”). Kate starts the post strong.
“Gender disparity is real, and traditionally, medical equipment designers have tended to have penises. That is problematic on a general level, but specifically, it means that problems that are specific to vaginas are often ignored or overlooked.”
I hadn’t connected this issue to the labels we use until I read the post. The post is outstanding, especially in the use of language and the unfolding of the thought process around the product. While reading it, I felt like I was in an alternate universe from the typical conversation I have about products. It was awesome.
Tech (hardware and software) is being interwoven into everything we do as a human species. To make this accessible to everyone, maybe we should start working a little harder on the words. More meaning, and less of either (a) tech or (b) marketing. Ponder that, all you cryptowarriors out there. Or members of any particular technology company mafia. And those of you in ecosystems.
What are you really trying to say?
Looking Glass, a Brooklyn-based company we recently led the Series A investment in, just released HoloPlayer One, the world’s first interactive lightfield development kit. This is a new interface that lets groups of people see and interact with floating 3D scenes without VR or AR headgear. While it’s an early release dev kit, it’s still as close to achieving the dream of the hologram shown in Blade Runner 2049 as I’ve seen.
This is relevant in my world because an investment theme we think a lot about is Human Computer Interaction. While it’s dangerous to try to predict the future, I think it’s a safe bet that in 20 years humans won’t be interacting with computers in the same way they are now. Amazon Echo is an example of one massive HCI shift that will impact our lives for years to come. Looking Glass is betting that another HCI shift will be related to how people interact with 3D content, like how a doctor will show a patient a CAT scan or how a 3D modeller will rig a Pixar character or design a rocket engine.
There are a lot of people who see this interface shift on the horizon, with billions of dollars flowing into AR and VR companies as evidence of this general interest. But what if there were a way to do it without the cost and constraints of a VR or AR headset?
The Looking Glass founders Shawn and Alex have been obsessed with chasing this dream since they were kids. Now they’re betting deeply against the headgear-based VR/AR trend by saying that holograms will be the next shift in human computer interaction. And they want fellow hologram hackers along for the ride.
I just got one (well, another one – we already have two HoloPlayer prototypes in the office with Structure Sensor scans of all the Foundry partners).
You can pre-order your HoloPlayer dev kit here. Save $50 with code TOTHEFUTURE.
He took his vorpal sword in hand:
Long time the manxome foe he sought —
So rested he by the Tumtum tree,
And stood awhile in thought.
– from Lewis Carroll, Jabberwocky
One, two! One, two! And through and through
The vorpal blade went snicker-snack!
He left it dead, and with its head
He went galumphing back.
I can almost see Obi-Wan swinging his lightsaber.
It delights me that we’ve invested in a company called Looking Glass who is making their own version of a vorpal sword.
Well, ok, it’s a volumetric display. But we’ll get there …
We’ve been investing in stuff around 3D since we started Foundry Group in 2007. Our first 3D-related investment was Oblong, which has reinvented the way we engage with computers (which we call infopresence) through the use of their 3D spatial operating system called g-speak and their collaboration product Mezzanine.
Well before the current generation of VR/AR/MR/XR/whateverR came about, we focused our attention and investing in the notion of a radical change in human computer interaction (HCI). We believed that in 2007 we were at the beginning of a 30+ year shift that would make the WIMP interface, which emerged in the early 1980s and was dominant in 2007, look and feel punch-card archaic in the future.
While we dig the moniker XR (for extended reality), we are much more interested in, well, reality. Our investments in 3D printing, first with MakerBot (the first successful consumer 3D printer) and now with Formlabs and Glowforge, cross the boundary between designing in 3D and making physical things. Our investment in Occipital has changed how we, and many others, think about 3D inputs and what to do with them. And life wouldn’t be much fun if you couldn’t play Rock Band in 3D, so Harmonix has you covered there.
So, why Looking Glass? After Stratasys acquired MakerBot for over $400m in 2013, we didn’t pay much attention to 3D printing for a few years. But, in 2015, when we invested in Glowforge, we realized that we had only begun to play out physical interaction with 3D. The industrial laser cutter market presented the same opportunity as the industrial 3D printer market, and hence our investment in the first 3D Laser Printer.
In 2016, when we invested in Formlabs, we had another insight that was reinforced by one of the ubiquitous Gartner Hype Cycle graphs. I think it speaks for itself.
We are now enjoying market leadership during the plateau of productivity.
One day, I was in Jeff Clavier’s office at SoftTech VC in San Francisco. He made me sit down with Shawn Frayne, the CEO of Looking Glass. Thirty minutes later, I called John Underkoffler, the CEO of Oblong, and said “John, I finally saw what you were trying to create with your holographic camera.”
And, as a bonus, the physical camera, which for over 20 years lived in the basement of my close friend Warren Katz’s house, now lives in my Carriage House in Longmont. It’s in several pieces, but that’s a detail that some day John will remedy.
It was an easy decision to invest in Looking Glass.
’Twas brillig, and the slithy toves
Did gyre and gimble in the wabe;
All mimsy were the borogoves,
And the mome raths outgrabe.
It’s here. And you know you want it. You can buy just the Rock Band 4 software (if you have your old instruments) or, if you are like me and you’ve given your instruments away, you can buy a new full bundle of everything.
And, in case you missed it, Spark Capital joined us as an investor last week, along with a few other long-time friends, in a $15 million round.
I originally invested in Harmonix as an angel investor in 1995. Its rise was well chronicled in this awesome Inc. Magazine long-form story titled Just Play. Basically, Harmonix tried to go out of business every year between 1995 and 2005 and just managed to fail at that, always coming up with a new revenue deal or a small amount of financing to stay alive before it became an overnight success in 2005 with the original launch of Guitar Hero.
MTV acquired the company in 2006 for $175m plus an earnout, which after a long “discussion” that ended in 2013, resulted in a total purchase price of over $700m. MTV decided to get out of the video game business in 2010 and sold the company back to the founders (Alex and Eran) and a small investor group.
In 2013 Alex and Eran asked me to join their board. We arranged a financing that made sense for both parties so Foundry Group could invest. Harmonix is easily the most accomplished video game company in the world when it comes to music and rhythm games, and with the eventual, long-awaited emergence of VR, I can think of no better company around our HCI theme to work with. Spark Capital, which was one of the original investors in Oculus, agrees, which makes me very happy.
Rock Band 4 is now out. In states like Colorado where a certain substance is now legal, I expect we’ll have a new marketing tie-in. In the rest of the world, let me just suggest that, having played the new game, you’ll want to get a copy and dust off your old equipment.
And get ready for some stuff that is just going to blow your mind – now and over the next 12 months – from my friends at Harmonix in Boston.
When we were approached with an investment opportunity by Matt Van Horn and Nikhil Bhogal in 2014, they started with a single, lighthearted but thought-provoking question:
Why does your kitchen look the same as Don Draper’s?
There has been little significant innovation in how we prepare and cook food at home since the microwave oven. In recent years, we’ve been delighted by in-home products like those from Nest and Sonos which have tested the waters of the connected home market and proven that it’s there.
My TV, thermostat, light system, security cameras, and even my 3D printer are a delight to use. But what about the kitchen? Amy and I continue to complain to each other about how miserable the user interfaces are on the very expensive stuff in our kitchen, and the blinking 12:00 on my super high-end Miele oven crushed my soul recently. Each year we expect to see something amazing at CES, only to encounter proofs of concept from large companies that George Jetson wouldn’t even be happy using.
When we met with Matt and Nikhil a year ago, they gave us a vision for what could be done in the kitchen around the notion of the connected home. Their vision has come to life with their first product – the June Intelligent Oven, a powerful, easy-to-use, computer-based countertop oven.
This thing works like magic. Except, of course, there is no magic involved, just computers and software and an awesome oven. Some of the biggest brains in hardware and software (from teams like Apple, Google, Facebook, Nest, and GoPro) as well as product designers and chefs, joined forces in a house in San Francisco and worked in stealth for a year. This week they are showing the world the June Oven.
You can pre-order to save a spot in line and they have a referral program to help you and your friends shave a little money off, but I’ll let you read more about that on their site. Also, use the promocode BRADFELD to get an additional $100 off the final shipping order.
This team is fearless. They put a camera in a box that heats to 450 degrees Fahrenheit and made it safe to touch. The oven recognizes the most commonly cooked foods you put in it and automatically configures itself. The cooking process is live-streamed so you can watch and control the oven from the couch.
It’s just awesome to see it all come together. I just pre-ordered two of them. Join me in the fun.
I did a really fun hour-long interview with Nikola Danaylov – who goes by Socrates – on the Singularity Weblog. We covered a wide range of topics around humans, machines, the singularity, where technology is going, and some philosophy around the human race and its inevitable Cylon future.
This was one of the more stimulating sets of questions I’ve had to address recently. My fundamental message – “be optimistic.” Enjoy!
On Saturday, I read the final draft of a magnificent book by David Rose. The book is titled Enchanted Objects: Design, Human Desire and the Internet of Things.
I’ve known David for many years. I was a huge fan and an early customer, but not an investor, in one of his companies (Ambient Devices) and we share a lot of friends and colleagues from MIT and the Media Lab. I was happy to be asked to blurb his book and then absolutely delighted with the book. It captured so many things that I’ve been thinking about and working on in a brilliantly done 300 page manuscript.
The basic premise of the book is that ultimately we want “enchanted objects”, not “glass slabs”, to interact with. Our current state of the art (iMacs, iPhones, Android stuff, Windows tablets, increasingly large TV screens) are all glass slabs. The concept of the “Internet of Things” introduces the idea of any device being internet connected, which is powerful, but enchanted objects take it one step further.
Now, the irony of it is that I read David’s book on a glass slab (my Kindle Fire, which is currently my favorite reading device.) But page after page jumped out at me with assertions that I agreed with, examples that were right, or puzzle pieces that I hadn’t quite put together yet.
And then on Saturday night it all hit home for me with a real life example. I was lying on the couch reading another book on my Kindle Fire at about 10pm. I heard a chirp. I tried to suppress it at first, but after I heard the second one I knew it was the dreaded chirp of my smoke detector. I continued to try to deny reality, but a few chirps later Amy walked into the room (she had already gone to bed) and said “do you hear what I hear?” Brooks the Wonder Dog was already having a spaz attack.
I got up on a chair and pulled the smoke alarm off the ceiling. I took out the 9V battery and was subject to a very loud beep. We scavenged around for 9V batteries in our condo. We found about 200 AAs and 100 AAAs but no 9Vs. Chirp chirp. We bundled up (it was 2 degrees out) and walked down the street to the Circle K to buy a 9V battery. They only had AAs. We walked back home, got in the car (with Brooks, who was now a complete mess from all the beeping) and drove to King Soopers. This time we got about 20 9Vs. We got home and I got back on the chair and wrestled with the battery holder. After the new battery was in the beeping continued. Out of frustration, I hit the “Test” button, heard a very loud extended beep, and then silence. At least from that smoke alarm.
Chirp. It turns out that I changed the battery in the wrong one. The one that was chirping was in another room. This one was too high for a chair, which resulted in us having to go into our storage cage in the condo basement and get a ladder. There was a padlock on our cage – fortunately the four digit code was one of the ones that everyone in the world who knows us knows. Eventually, with the ladder, the new batteries, and some effort I got the chirping to stop.
We have those fancy white smoke alarms that are wired directly into the power of the house. I have no idea why they even need a battery. The first thing they do when they want your attention is to make an unbelievably obnoxious noise. Then, they are about as hard as humanly possible to silence. They generate one emotion – anger.
Not an enchanted object.
In comparison, Nest is trying to make an enchanted object out of their new smoke detector product. After reading the Amazon reviews, I realize this is an all-or-nothing proposition, and after spending $30 on 9V batteries and then changing all of the ones in the existing smoke detectors, I don’t feel like spending $550 to replace the four smoke detectors in my condo. Plus, the one I want – the wired one – isn’t in stock. So I’ll wait one product cycle, or at least until the beeping crushes my soul again.
We’ve got a bunch of investments in our human computer interaction theme that aspire to be enchanted objects, including Fitbit, Modular Robotics, littleBits, Orbotix, and Sifteo. I’m going to start using David’s great phrase “enchanted objects” to describe what I’m looking for in this area. And while I’ll continue to invest in many things that improve our glass slab world, I believe that the future is enchanted objects.
We recently invested in littleBits. It’s another of our investments that traces its roots to the MIT Media Lab. It’s also another investment we are making with our friends from True Ventures. It’s another one that mixes hardware and software in a delightful way that is part of our human computer interaction theme. And yet another investment in New York.
Ayah Bdeir, the CEO of littleBits, has blown my mind with her vision of where she is going to take this company. Phase 1 of littleBits was, in the company’s words, creating a “library of electronic modules that snap together with tiny magnets for prototyping, learning, and fun.” Today there are over 50 different bits that you can buy right now, individually or bundled in different kits.
This, by itself, is awesome. But the next phase of where Ayah is taking the company is even more awesome. And, as a result, I predict you will have some littleBits somewhere in your world before you realize it. And, since Thanksgiving is just around the corner, we’ve got a kit to make a programmable lazy susan for your table if you need one.
Remember, the machines have already taken over. Get on board if you want to be able to play with them.
This is a picture of me completely and unapologetically engrossed in a game of Space Invaders on a VIC 20. Here’s an early commercial for it, featuring the one and only William Shatner.
Several weeks ago the team at the Media Archeology Lab (MAL) celebrated their accomplishments to date by hosting an event – called a MALfunction – for the community. Attendees included founders of local startups, the Dean of the College of Arts and Sciences of the University of Colorado, students interested in computing history, and a few other friends. The vibe was electric – not because there were any open wires from the machines, but because this was truly a venue and a topic that is a strong intersection between the university and the local tech scene.
Recently, Amy and I underwrote the Human Computer Interaction lab at Wellesley College. We did so not only because we believe in facilitating STEM and IT education for young women, but also because we both have a very personal relationship to the school and to the lab. Amy, on a weekly basis, speaks to the impact that Wellesley has had on her life. I, obviously, did not attend Wellesley, but I have a very similar story. My interest in technology came from tinkering with computers, machines, and software in the late 1970s and early 1980s, just like the collection that is curated by the MAL.
Because of this, Amy and I decided to provide a financial gift to the MAL as well as my entire personal computer collection which included an Apple II (as well as a bunch of software for it), a Compaq Portable (the original one – that looks like a sewing machine), an Apple Lisa, a NeXT Cube, and my Altair personal computer.
Being surrounded by these machines just makes me happy. There is a sense of joy to be had from the humming of the hard drives, the creaking of 30-year-old space bars, and squinting at the less-than-retina displays. While walking back to my condo from the lab, I think I pinned down what makes me so happy while I’m in the lab. An anachronistic experience with these machines is: (1) a reminder of how far we have come with computing, (2) a reminder to never take computing for granted – it’s shocking what the label “portable computer” was applied to in 1990, and (3) a perspective on how much further we can innovate.
My first real computer was an Apple II. I now spend the day in front of an iMac, a MacBook Air, and an iPhone. When I ponder this, I wonder what I’ll be using in 2040. The experience of the lab is one of true technological perspective, and those moments of retrospection make me happy.
In addition, I’m totally blown away by what the MAL director, Lori Emerson, and her small team have pulled off with zero funding. The machines at MAL are alive, working, and in remarkably good shape. Lori, who teaches English full time at CU Boulder, has created a remarkable computer history museum.
Amy and I decided to adopt MAL, and the idea of building a long term computer history museum in Boulder, as one of our new projects. My partner Jason Mendelson quickly contributed to it. If you are up for helping us ramp this up, there are three things you can do to help.
1. Give a financial gift via the Brad Feld and Amy Batchelor Fund for MAL (Media Archeology Lab).
2. Contribute old hardware and software, especially stuff that is sitting in your basement.
3. Offer to volunteer to help get stuff set up and working.
If you are interested in helping, just reach out to me or Lori Emerson.