In the super cool thing category, it’s always fun to see two companies we are investors in easily put together a demonstration of the integration of their products.
Oblong is a master of spatially interacting with 3D data on a 2D display. Looking Glass lets you interact with 3D data in a 3D display with their product (the Looking Glass.)
Together, you can move and interact with 3D images on either a 3D display or a 2D display.
Now, if we could only print this out on a 3D printer. Hmmm.
If you are an NCIS fan, you are probably excited about the upcoming 48 Hours: NCIS which premieres on Tuesday, April 25, 2017, 10pm ET/PT. I like NCIS, but I’m especially excited about Oblong’s Mezzanine product being a central part of the show.
John Underkoffler has been showing us – through Hollywood – the future of user experiences since Minority Report (where he was the science and technology advisor.) It makes me smile to see him, and his gang at Oblong, continue to lead the way.
He took his vorpal sword in hand:
Long time the manxome foe he sought —
So rested he by the Tumtum tree,
And stood awhile in thought.
– from Lewis Carroll, Jabberwocky
Jabberwocky and the vorpal sword always make me think of Princess Leia saying “Help me, Obi-Wan Kenobi, you’re my only hope.”
One, two! One, two! And through and through
The vorpal blade went snicker-snack!
He left it dead, and with its head
He went galumphing back.
I can almost see Obi-Wan swinging his lightsaber.
It delights me that we’ve invested in a company called Looking Glass who is making their own version of a vorpal sword.
Well, ok, it’s a volumetric display. But we’ll get there …
We’ve been investing in stuff around 3D since we started Foundry Group in 2007. Our first 3D-related investment was Oblong, which has reinvented the way we engage with computers (which we call infopresence) through the use of their 3D spatial operating system called g-speak and their collaboration product Mezzanine.
Well before the current generation of VR/AR/MR/XR/whateverR came about, we focused our attention and our investing on the notion of a radical change in human computer interaction (HCI). We believed that in 2007 we were at the beginning of a 30+ year shift that would make the WIMP interface, which emerged in the early 1980s and was dominant in 2007, look and feel punch-card archaic in the future.
While we dig the moniker XR (for extended reality), we are much more interested in, well, reality. Our investments in 3D printing, first with MakerBot (the first successful consumer 3D printer) and now with Formlabs and Glowforge, cross the boundary between designing in 3D and making physical things. Our investment in Occipital has changed how we, and many others, think about 3D inputs and what to do with them. And life wouldn’t be much fun if you couldn’t play Rock Band in 3D, so Harmonix has you covered there.
So, why Looking Glass? After Stratasys acquired MakerBot for over $400m in 2013, we didn’t pay much attention to 3D printing for a few years. But, in 2015, when we invested in Glowforge, we realized that we had only begun to play out physical interaction with 3D. The industrial laser cutter market presented the same opportunity as the industrial 3D printer market, and hence our investment in the first 3D Laser Printer.
In 2016, when we invested in Formlabs, we had another insight that was reinforced by one of the ubiquitous Gartner Hype Cycle graphs. I think it speaks for itself.
We are now enjoying market leadership during the plateau of productivity.
One day, I was in Jeff Clavier’s office at SoftTech VC in San Francisco. He made me sit down with Shawn Frayne, the CEO of Looking Glass. Thirty minutes later, I called John Underkoffler, the CEO of Oblong, and said “John, I finally saw what you were trying to create with your holographic camera.”
Did I mention that John was one of the inventors, in 1990, of the holographic camera?
And, as a bonus, the physical camera, which for over 20 years lived in the basement of my close friend Warren Katz’s house, now lives in my Carriage House in Longmont. It’s in several pieces, but that’s a detail that some day John will remedy.
It was an easy decision to invest in Looking Glass.
’Twas brillig, and the slithy toves
Did gyre and gimble in the wabe;
All mimsy were the borogoves,
And the mome raths outgrabe.
My friends at Oblong have been involved in some very cool new installations of their Mezzanine product in different vertical markets. They are suddenly seeing a lot of interest from hospitals and health care systems.
A recent new installation is the Mercy Virtual Care Center, which is the world’s first virtual care center.
If you are at HIMSS this week in Las Vegas, go visit Oblong at booth #10725 on the show floor and see the future of collaboration in action.
As I read about the unveiling of the Tesla Model X, I have two thoughts. The first one is “I want” (hint: Amy – you need to replace your red Range Rover.) The second is that the price of admission is an amazing product.
Indulge me while I go on an amazing product rant from our portfolio.
I could keep going but you get the idea. When I reflect on our successful investments, regardless of the form factor (software or hardware or both) that they take, they all are amazing products. And the founders come from a product-first mindset – their goal is to unambiguously create the best product, one that delights users every time they come in contact with it.
I’ve heard the discussion about how important product is for over 20 years of being an investor. But it’s not important anymore. Instead, an amazing product is simply price of admission. If you don’t have an amazing product, you don’t get to play, at least in my little corner of the world.
Strong AI has been on my mind a lot lately. We use weak AI all the time, and the difference between the two has become more apparent as the limitations, in a particular context, of an application of weak AI (such as Siri) become painfully apparent in daily use.
When I was a student at MIT in the 1980s, computer science and artificial intelligence were front and center. Marvin Minsky and Seymour Papert were the gods of MIT LCS and just looking at what happened in 1983, 1984, and 1985 at what is now CSAIL (what used to be LCS/AI) will blow your mind. The MIT Media Lab was created at the same time – opening in 1985 – and there was a revolution at MIT around AI and computer science. I did a UROP in Seymour Papert’s lab my freshman year (creating Logo on the Coleco Adam) and took 6.001 before deciding to do Course 15 and write commercial software part-time while I was in school. So while I didn’t study at LCS or the Media Lab, I was deeply influenced by what was going on around me.
Since then, I’ve always been fascinated with the notion of strong AI and the concept of the singularity. I put myself in the curious observer category rather than the active creator category, although a number of the companies I’ve invested in touch on aspects of strong AI while incorporating much weak AI (which many VCs are currently calling machine learning) into what they do. And, several of the CEOs I work with, such as John Underkoffler of Oblong, have long histories working with this stuff going back to the mid-1980s through late 1990s at MIT.
When I ask people what the iconic Hollywood technology film about the future of computing is, the most common answer I get is Minority Report. This is no surprise to me as it’s the one I name. If you are familiar with Oblong, you probably can make the link quickly to the idea that John Underkoffler was the science and tech advisor to Spielberg on Minority Report. Ok – got it – MIT roots in Minority Report – that makes sense. And it’s pretty amazing for something done in 2002, which was adapted from something Philip K. Dick wrote in 1956.
Now, fast forward to 2014. I watched three movies in the last year purportedly about strong AI. The most recent was Her, which Amy, Jenny Lawton, and I watched over the weekend, although we had to do it in two nights because we were painfully bored after about 45 minutes. The other two were Transcendence and Lucy.
All three massively disappointed me. Her was rated the highest and my friends seemed to like it more, but I found the portrayal of the future, in which strong AI is called OS 1, to be pedantic. Samantha (Her) had an awesome voice (Scarlett Johansson) but the movie was basically a male fantasy of a female strong AI. Lucy was much worse – once again Scarlett Johansson shows up, this time as another male fantasy as she goes from human to super-human to strong AI embodied in a sexy body to black goo that takes over, well, everything. And in Transcendence, Johnny Depp plays the sexy strong AI character that saves the femme fatale love interest after dying and uploading his consciousness, which then evolves into a nefarious all-knowing thing that the humans have to stop – with a virus.
It’s all just a total miss in contrast to Minority Report. As I was muttering with frustration to Amy about Her, I wondered what the three movies were based on. In trolling around, they appear to be screenplays rather than adaptations of science fiction stories. When I think back to Philip K. Dick in 1956 to John Underkoffler in 2000 to Steven Spielberg in 2002 making a movie about 2054, that lineage makes sense to me. When I think about my favorite near-term science fiction writers, including William Hertling and Daniel Suarez, I think about how much better these movies would be if they were adaptations of their books.
The action adventure space opera science fiction theme seems like it’s going to dominate in the next year of Hollywood sci-fi movies, if Interstellar, The Martian (which I’m very much looking forward to) and Blackhat are any indication of what is coming. That’s ok because they can be fun, but I really wish someone in Hollywood would work with a great near-term science fiction writer and a great MIT (or Stanford) AI researcher to make the “Minority Report” equivalent for strong AI and the singularity.
If you are in NY on 7/24 and want to have your mind blown by one of my favorite companies ever, go to the Oblong NYC Open House to see their new demo center.
In addition to an amazing demo and good food, Christopher Walsh (Director of Product Effectiveness for McGraw Hill Financial S&P Capital IQ) is going to be talking about how his organization uses Oblong’s Mezzanine to change the way they work.
It’s Thursday, July 24th from 5:30-8:30pm ET. Register here.
I get demos every day. Multiple times a day. I don’t want to see a PowerPoint deck – I want to play with something. I don’t want to hear a description of what you do – I want to see a demo. I don’t want you to tell me your background, where you went to school, or where you grew up. I want to see what you are working on.
I still remember my first meeting with Bre Pettis at MakerBot. I walked into the Botcave in Brooklyn and was confronted with a long, narrow Brooklyn-style industrial building where I could see people working away in the back. But before I got to them, I had to walk through a 1000 sq. ft. area of MakerBot Thing-O-Matics printing away. This was an early “bot farm” and it probably took 15 minutes before I walked the gauntlet. They were printing all kinds of things, there were display cases of other stuff that had been printed, and a vending machine for Thing-O-Matic parts.
When I got to the back where people were working, I totally understood what MakerBot did and what was possible with 3D printing.
We are lucky to be investors in a bunch of companies creating amazing new products. One of them, Oblong, has been working on spatial computing since John Underkoffler’s early research in the 1990s at the MIT Media Lab. For a number of years they were described as the “Minority Report” technology (John was the science/tech advisor to Spielberg and came up with all the tech in the movie.) The following video is John showing off and explaining the core G-Speak technology.
The demo is iconic and amazing, but it takes too long and is too abstract for their corporate customers buying Oblong’s Mezzanine product. The short, five-minute “overview video” follows.
While this gives you a feel for things, it’s still showing the “features and functionality” of the tech, applied to a general use case. For several months, I kept banging on them to set up a simple use case, which is how I use the Mezzanine system in our office. I use it every day and it’s been a huge factor for me in eliminating all of my travel.
A few months ago, Oblong had a sales off-site to go through the progress they’ve made this year and to focus on the balance of the year. They’ve had a great year, with a strong quarter-over-quarter sales ramp for Mezzanine on both a dollar and unit basis. The customer list is incredible, their classical enterprise land and expand strategy is working great, and new high-value use cases are being defined with each customer. So I smiled when the following slide popped up on my Mezzanine during our weekly leadership team call.
While a little abstract in writing (I don’t expect you to understand the first three bullet points unless you know how Mezzanine works), when it’s shown in the first five minutes of a demo it simply blows your mind. And you totally get all three of the core technologies that Oblong has incorporated in Mezzanine (spatial computing, pixel virtualization, and data pipelining.) Your next reaction is “I want one.” And then you are ready for the feature / function discussion, which can easily go on for 30 minutes.
There is endless talk about product development and getting “personas developed” while you figure out how to build your product for them. This approach is equally useful for demos, but it is so often overlooked. I can’t tell you the number of times people start just showing me stuff, rather than saying “here’s the problem I’m going to solve for you that I know you have” – BOOM – and then I’m totally captured for the next 30 minutes.
Try it. The first five minutes is the most important with someone like me. Don’t waste it.
I stopped travelling mid-May (I arrived home in Boulder from San Francisco on 5/17). I’ve decided not to travel at all for the rest of 2013, except for three personal trips (my parents’ 50th anniversary, Amy’s birthday, and my birthday.) After travelling 50% – 75% of the time for the last 20 years, I needed a break.
It has been awesomely mindblowingly great to not travel.
I’ve had three other periods of extended no-travel in the last 20 years. I stopped travelling for three months after 9/11. Two summers ago Amy and I spent 60 days together in Europe (half in France / half in Tuscany) just living (no travel). Last summer we spent 90 days at our house in Keystone. It’s clear I had a taste of this, but nothing like where I am right now.
Even though it has only been seven weeks, when I look forward to the rest of 2013 I feel huge amounts of open space and time in front of me. I know this has helped me come out of the depression, which I just wrote about in an article in Inc. Magazine, that I struggled with for the first part of this year.
But it’s more profound than that. In a few short months, I’ve changed my work pattern a lot. I feel so much more rested and alert. When I’m doing something, I’m in the moment. The companies I’m an investor in are all over the place, but I feel like they are actually getting more of my attention because I’m not being torn in a zillion different directions.
I don’t feel like I’m constantly trying to jam in the “work” around all the friction time – in airports, in taxis and cars being driven to things, before I head out to yet another dinner on the road, or late in my hotel before I go to sleep. My environment is familiar and comfortable and things just flow.
I’m mastering video conferencing – I’ve now got every configuration a human could need. I figured out three big things that solve for 99% of the strangeness of it.
I’ve also started using my Mezzanine video conferencing system extensively – it’s just incredible. More on that in a separate post.
I love Boulder and I’m finding myself running a lot again. It’s hard to run as much as I’d like when I’m on the road – early morning meetings, fatigue, and being in random places gets in the way. But here, I just put on my shoes and head out the door for one of my favorite trails. With or without Brooks the wonder dog.
On that note, I think I’ll go for a run right now.