Brad Feld

Tag: HCI

Jason and I were at an Oblong board meeting last week and spent the entire day at the company. It’s grown a lot over the past few months and it was fun to spend time with a number of folks we hadn’t met before. The first Oblong baby was born while we were all eating lunch which resulted in lots of good cheer, karma, and the revelation from another member of the Oblong team that his wife recently found out that she was pregnant.

But the best part was playing with a bunch of the new cool shit that Oblong is working on. It’s one thing to look at what Oblong is building (as in the TED Video below); it’s a whole different experience to actually get your hands on it. Fortunately they are driving hard toward that and we expect a Q3 product release that will start bringing Oblong’s g-speak spatial operating environment to the masses.

In the meantime, if you are interested in a job helping reinvent the graphical user interface at one of the most creative and technologically challenging startups that I’ve had the pleasure of working with, Oblong is hiring. They are specifically looking for a senior programming team lead, an application programmer, a JavaScript / HTML stud or studette, and a QA lead, but if you are excellent at what you do on the software development side, I’m sure they’d love to talk to you. You can email me or send a note to jobs@oblong.com.


In the mid 1990’s I used an email client that did a pretty good job of “threading conversations.”   The UI was kind of crummy, but it did some interesting things.  It was called Lotus Notes.  I also invested in a company called NetGenesis that made the first threaded web discussion software based on a construct that had been deeply implemented in BBSs and Notes; in fact, we referred to it as “bringing Lotus Notes like threaded discussion functionality to the web.”  That product, net.Thread, was acquired by another company I was an investor in (eShare) which went on to have a very successful acquisition by a public company called Melita.  I have no idea where net.Thread ended up but as a master-emailer I’ve always wondered why the very simple concept of a threaded conversation never became a standard part of the email UI.

Suddenly, it’s everywhere.  It started being talked about a few years ago when threaded conversations appeared as a core feature of Gmail.  A conversation view existed in Outlook 2007 but it sucked. When I upgraded to Outlook 2010 I was pleasantly surprised that the conversation view was excellent, although it was bizarre to me that it wasn’t the default view.

On Saturday when I started my month-long diet of only Apple products, I immediately found conversations in Mac Mail.  It’s implemented perfectly.  Then, when I upgraded my iPhone to iOS 4, voila, conversations again!

Within a year, a UI construct that has been bouncing around for 15 years but never really crossed over into the mainstream took hold.  And it makes email much better to deal with, especially if you are part of an organization (or group of people) that have a heavy “reply-all” culture.

Ironically, it’s a pretty simple feature conceptually, but the UI implementation makes all the difference in the world.  I can’t figure out if the Gmail implementation set the baseline that everyone is now copying or if email conversations just entered the zeitgeist.  Regardless, it’s an interesting example of how a simple construct can lie dormant for a long time and then suddenly be everywhere.
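
The underlying construct really is simple.  Here’s a minimal sketch of how a client can group messages into conversations using the standard Message-ID and In-Reply-To headers that have been in email since forever.  Real implementations (like the classic jwz threading algorithm) also walk the References header and handle missing parents; this just chains each reply back to the root of its thread.

```python
def thread_messages(messages):
    """Group messages into conversations.

    Each message is a dict with 'id' (its Message-ID) and an optional
    'in_reply_to' (the Message-ID it replies to).  Assumes messages
    are processed in arrival order.  Returns {root_id: [message ids]}.
    """
    root_of = {}   # message id -> root id of its conversation
    threads = {}   # root id -> list of message ids in the conversation

    for msg in messages:
        parent = msg.get("in_reply_to")
        # If we know the parent, join its thread; otherwise start a new one.
        root = root_of.get(parent, msg["id"])
        root_of[msg["id"]] = root
        threads.setdefault(root, []).append(msg["id"])
    return threads

msgs = [
    {"id": "a1"},
    {"id": "a2", "in_reply_to": "a1"},
    {"id": "b1"},
    {"id": "a3", "in_reply_to": "a2"},
]
print(thread_messages(msgs))  # {'a1': ['a1', 'a2', 'a3'], 'b1': ['b1']}
```

The data model was always there; the fifteen-year gap was entirely about presenting it well.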

I only hope someone doesn’t get a patent on this next year.  That would just be stupid.


“In five years when you buy a computer you’ll get this.” John Underkoffler, Oblong’s Chief Scientist, at 14:20 in the video.

I’ve been friends with John Underkoffler since 1984 and we’ve been investors in Oblong since 2007.  Ever since I first met John I knew that he was an amazing thinker.  John, his co-founders at Oblong, and the team they have assembled are creating the future of user interfaces.  This year has started off incredibly fast for them – they’ve spent the last five months scaling the business as the result of several large customers and are in the home stretch of releasing their first “shrink wrapped product” in Q3.  Get ready – the future is closer than you imagine.


I do not want to tangle with an army of 10,000 of these.  Especially ones that have lots of sharp pokey electrocution things built into their foreheads.

I wonder what my golden retriever would think of these dudes.  Now, what would have really been sweet is if I had one of these when I was 10 and could put it in my brother’s bedroom at night.  Bwahahahahahahahahaha.


Following is an outstanding 30 minute presentation by Jesse Schell at DICE 2010 explaining how our life is just one big game. 

Points everywhere, followed by an optimistic call to use this to make us better.


I talk about human computer interaction (HCI) a lot on this blog.  We’ve invested in a number of companies in our HCI theme, including Oblong, Organic Motion, and EmSense and have a few more that we are working on that hopefully will be announced shortly.  When I think about the areas I’ve been paying the most attention to and am the most intrigued with as an investor, HCI rises to the top of the list. 

This morning I read an article on SeattlePI titled UW researchers look to reinvent the graphical user interface.  While the headline is a bit sensational, the project (Prefab) is very cool.  At first glance I thought it was simply rewriting HTML pages (clever, but not that big a deal) but then I realized it was doing something more profound.  The five minute video is worth a look if you are into these types of things.

The bubble cursor and sticky icon examples are great ones.  Starting at 1:45 you see the bubble cursor and sticky icons in action on Firefox in Vista.  At 2:05 you see it on OS X.  At 2:45 you see it in action on a YouTube player.  The magic seems to be around pixel-level mapping which, as anyone working in adtech knows, is where the real action is.  It’s pretty cool to see it being used to map UI functionality.
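
To give a flavor of the pixel-level idea: find where a known widget’s pixel pattern appears inside a screenshot, then overlay new behavior at those coordinates.  Prefab itself is far more sophisticated (it builds prototype models of widgets rather than exact-matching), so treat this as a toy sketch of exhaustive template matching on small pixel grids.

```python
def find_widget(screen, template):
    """Return (row, col) positions where template exactly matches screen.

    Both arguments are 2D lists of pixel values.
    """
    sh, sw = len(screen), len(screen[0])
    th, tw = len(template), len(template[0])
    hits = []
    for r in range(sh - th + 1):
        for c in range(sw - tw + 1):
            # Compare every template pixel against the screen region.
            if all(screen[r + i][c + j] == template[i][j]
                   for i in range(th) for j in range(tw)):
                hits.append((r, c))
    return hits

screen = [
    [0, 0, 0, 0],
    [0, 1, 2, 0],
    [0, 3, 4, 0],
    [0, 0, 0, 0],
]
button = [[1, 2], [3, 4]]  # the widget's known pixel pattern
print(find_widget(screen, button))  # [(1, 1)]
```

Because the matching happens on pixels, it works identically on Firefox in Vista, OS X, or a YouTube player, which is exactly why the demo is so striking.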


I’ve been hinting about a new conference that we’ve been working on with Eric Norlin that complements Defrag and Glue.  Eric is about to launch it and the splash page for the Blur Conference is up.

If you are familiar with Defrag and Glue, you know they are built around two of Foundry Group’s themes (Protocol and Glue respectively).  Blur is being built around our Human Computer Interaction theme, but with a twist.  Instead of simply being able to “see cool stuff up close”, our goal with Blur will be to create an environment where you can actually use and work with this stuff.  We’ll have user-oriented demos, hackathons, and tons of crazy shit no one has ever seen before.

Plus, we’ll give away a lot of cool toys, have a ton of smart people who are working on the next generation of HCI in one place, and have some fun surprises.  And we are doing it in an environment that is especially tuned for a conference like Blur.

I’m incredibly excited about what Eric has put together for this year’s Glue Conference (as I wrote about the other day).  He’s setting a high bar for Blur, where the goal will now be to have a few brains explode!  Get ready – it’s never dull around here.


If you are a long time reader of this blog, you know that I’m a huge believer that the way we interact with computers in 20 years will be radically different than how we interact with them today.  I’ve put my money where my mouth is as Foundry Group has invested in a number of companies around human computer interaction, including Oblong.

For the past few years, every time someone talks about next generation user interfaces, a reference to the movie Minority Report pops up.  Sometimes the writer gets this right and links it back to John Underkoffler, the co-founder of Oblong, but many times they don’t.  Today the NY Times got it right in their article You, Too, Can Soon Be Like Tom Cruise in ‘Minority Report’.

John Underkoffler, who helped create the gesture-based computer interface imagined in the film “Minority Report,” has brought that technology to real life. He gave a demonstration at the TED Conference in Long Beach, Calif., on Friday.

That’s a picture of John Underkoffler at TED on Friday giving one of his jaw dropping demos of Oblong’s g-speak spatial operating environment.  Lest you think this is science fiction, I can assure you that Oblong has several major customers, is generating meaningful revenue, and is poised to enter several mainstream markets with g-speak derived products.

The company has been steadily building momentum over the past few years since we invested.  The TechCrunch article The iPad Is Step 1 In The Future Of Computing. This Is Step 2 (Or 3) gives you a little of the history.  More of the history is in Oblong’s post origins: arriving here, which goes back to 1994.  I personally have stories going back to 1984 when I first met John, but we’ll save those for another day.

While there is an amazing amount of interesting stuff suddenly going on around HCI (and we have invested in a few other companies around this), Oblong is shipping step 2 and about to ship step 3 while most are working on step 1.  As John likes to say, “the old model of one human, one machine, one mouse, one screen is passe.”


I weigh 209.4 this morning.  That’s down from 220 when I Declared A Jihad on My Weight on 10/27/08 although it doesn’t look like I’ll make my Anti-Charity goal of 200 by 1/31/09 (more on that in a post on 2/1/09).

I was thinking about my weight this morning as I entered it into the online system at GoWear.  I thought about it again when I entered it into Gyminee.  And then into Daytum.  I’m going for a run in a little while so I’ll enter it again into TrainingPeaks.

Here’s what I’m doing:

  1. Go to the appropriate web site.
  2. Choose the appropriate place to enter the data.
  3. Type 209.4 and press return.

Four times.  Every day.  Pretty ridiculous.  If you reduce the data to its core elements, they are:

  1. Web site id [GoWear, Gyminee, Daytum, TrainingPeaks]
  2. User Id (almost always bfeld)
  3. Timestamp (or two fields – date, time) – automatically generated by my computer
  4. Weight
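
Reduced to a record, that data might look something like this (the field names here are invented for illustration).  Only one field requires a human to do anything; the rest is boilerplate that each of the four sites re-collects independently.

```python
import json

# A hypothetical record mirroring the four core elements above.
record = {
    "site": "Gyminee",                    # 1. web site id
    "user": "bfeld",                      # 2. user id (almost always the same)
    "timestamp": "2009-01-29T07:00:00Z",  # 3. generated automatically
    "weight_lbs": 209.4,                  # 4. the only value actually measured
}
print(json.dumps(record))
```

Typing 209.4 four times a day means re-creating this same record four times, varying only the site id.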

The only actual piece of data that I need to measure is weight.  I measure this by standing on a scale each morning.  The scale is a fancy one – it cost about $100, looks pretty, and has a bunch of extra things I don’t look at such as BMI.  I have an identical scale in my house in Keystone (although the battery is dead and needs to be replaced).

Some day, in the future, I’ll be able to step on the scale.  And that will be it.  My weight will automatically go into whatever online systems I want it to.  I won’t have to do anything else. 

Of course, one of the assumptions is that my scale(s) are “network compatible”.  While you may joke that this is the age old “connect my refrigerator to the Internet problem” (and it is), I think it’s finally time for this to happen.  As broadband and wifi become increasingly ubiquitous and inexpensive, there is no reason that any electronic device shouldn’t be IP connected, in the same way that microprocessors are now everywhere and pretty much everything has a clock in it (even if some of them still blink 12:00).

So, accept this assumption.  Then, I’m really only talking about a “Brad-centric” data payload.  While I’ll have a lot more data than simply weight that I might want in my payload, let’s start with the simple case (weight).  Right now, we are living in a system-centric world where data is linked first to a system and then a user.  Specifically, you have to operate in the context of the system to create data for a user.

Why not flip this?  Make things user-centric.  I enter my data (or a machine / device collects my data.)  I can configure my data inputs to feed data into “my data store” (which should live on the web / in the cloud).  Then, systems can grab data from my data store automatically.  All I have to do is “wire them up” which is a UI problem that – if someone is forward thinking enough – could also be solved with a single horizontal system that everyone adopts.
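
A minimal sketch of the flip, with invented names (DataStore, subscribe, publish are all hypothetical): one user-owned store that a device writes to exactly once, and any number of systems wired up to read from it.

```python
class DataStore:
    """A user-centric data store: the device publishes once, every
    wired-up system receives the data automatically."""

    def __init__(self, user):
        self.user = user
        self.records = []
        self.subscribers = []  # callbacks for each wired-up system

    def subscribe(self, callback):
        """Wire up a system (GoWear, Gyminee, ...) to receive new data."""
        self.subscribers.append(callback)

    def publish(self, metric, value):
        """A device (the scale) records a measurement exactly once."""
        record = {"user": self.user, "metric": metric, "value": value}
        self.records.append(record)
        for notify in self.subscribers:
            notify(record)

store = DataStore("bfeld")
received = []
store.subscribe(received.append)    # e.g. Gyminee pulling from my store
store.subscribe(received.append)    # e.g. TrainingPeaks doing the same
store.publish("weight_lbs", 209.4)  # step on the scale once
print(len(received))  # 2 -- one measurement, every system updated
```

The hard part isn’t the plumbing; it’s getting every site to agree to read from the user’s store instead of owning its own copy, which is the “wire them up” UI problem.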

Right now there is a huge amount of activity around the inverse of this problem – taking widely diffuse data and re-aggregating it around a single user id.  This deals with today’s reality of how data is generated (system-centric) but doesn’t feel sustainable to me as user-generated data continues to proliferate geometrically.

Enough.  As I said in my tweet earlier today, “thinking about data.  thinking about running.  thinking about thinking.”  Time to go run and generate some more data.


This year at Sundance, Oblong unveiled Tamper.  The Tamper application is a gestural interface for cinematic design.  It is built on Oblong’s g-speak spatial operating environment and is a fun example of how Oblong’s core technology can be applied to a film editing system.

Tamper is part of the New Frontier on Main exhibit located at 333 Main Street on the lower level.  Oblong has set up a channel on YouTube to show some of the various videos that folks at Sundance are making with Tamper. 

I love working with these guys – they are mind-bendingly creative.