Category: Technology

Nov 7 2019

One Reason Superhuman Is So Much More Effective For Me Than Gmail

I’ve been a Superhuman email fan for a while. I decided a week ago to go try Gmail and see if I still liked Superhuman so much better.

After about two hours, I went back to Superhuman.

Several days later, I tried Gmail again, deciding that I was just grumpy for some reason. I bounced back to Superhuman within an hour.

This time I sat and thought about why I liked Superhuman so much better. It took a little while for it to come to me, but when it did it was painfully obvious.

When I’m looking at Superhuman, I am processing one consistent font. All the time. It’s the same for every email, except for the occasional over-formatted and stylized email marketing newsletter thingy. My focus stays on the content and the clean screen. I just read and respond.

When I’m in Gmail, there are a zillion random things everywhere. Emails are in different fonts – both types and sizes. My brain is constantly processing multiple inputs that make me tired, distract me, and slow me down.

All I really want to do is get through my email. When I just sit and process it email by email, with no context switching or distractions, it gets done quickly. Superhuman facilitates this; Gmail doesn’t.

Blogging is similar. The newest WordPress editor is delicious. I just type. It’s clean, simple, and always the same.

When I chew on it more, it’s part of why I love reading on a Kindle. The font is always the same, no matter what I’m reading. Suddenly, my brain is not processing different textures when I’m processing text.

It’s kind of clear to me when I type it out, but it wasn’t obvious until I thought about it the other day. We’ve taken the UI to a place of divergence – it’s either consistent and simple or chaotic and complex. I’m all about consistent and simple.

Oct 22 2019

What’s Your Hobby?

Amy and I had dinner recently with Chris Couch, a friend from MIT who I hadn’t seen in 25 years. Before we had dinner, Chris sent us an email with a link to his High Altitude Photography Platform along with the video from Mission 1 of the HAPP.

Chris has a day job, so this has been his hobby for the past two years. It’s pretty epic – both as a project and a hobby. And, it’s reflective of the kind of brain many of my MIT friends have.

Amy met Chris on a flight from Boston to Dallas. She was flying to Dallas to meet me and my parents for a holiday weekend and Chris was flying to Dallas to meet his parents. They were sitting next to each other and Chris started writing equations on a napkin. Amy asked him what he was calculating and he said: “the amount of fuel the plane will use on this flight.” It was friendship at first sight.

While we hadn’t seen each other in many years, we reconnected as though no time had passed. While we’ve aged, the playful and curious spirit that we all had in our 20’s shined through during our long and winding conversation at dinner.

I love that Chris’ hobby (the HAPP) is a reflection of his brain. Is yours?

Oct 7 2019

The Miasma

In Neal Stephenson’s newest book, Fall; or, Dodge in Hell: A Novel, the protagonist Richard “Dodge” Forthrast uses the phrase “The Miasma” to refer to the collection of technology that we commonly call “The Internet.”

When I first came across the phrase, I said out loud, “Brilliant.”

I was poking around on the Miasma this morning looking for a reference to this and found this Slashdot post about an interview with Stephenson from PC Magazine.

Q: How would you describe the current state of the internet? Just in a general sense of its role in our daily lives, and where that concept of the Miasma came from for you.

Neal Stephenson: I ended up having a pretty dark view of it, as you can kind of tell from the book. I saw someone recently describe social media in its current state as a doomsday machine, and I think that’s not far off. We’ve turned over our perception of what’s real to algorithmically driven systems that are designed not to have humans in the loop, because if humans are in the loop they’re not scalable and if they’re not scalable they can’t make tons and tons of money.

The result is the situation we see today where no one agrees on what factual reality is and everyone is driven in the direction of content that is “more engaging,” which almost always means that it’s more emotional, it’s less factually based, it’s less rational, and kind of destructive from a basic civics standpoint… I sort of was patting myself on the back for really being on top of things and predicting the future. And then I discovered that the future was way ahead of me. I’ve heard remarks in a similar vein from other science-fiction novelists: do we even have a role anymore?

Still brilliant.

I’ve spent the past year struggling with the Miasma. I’ve deliberately disengaged from some of it through deleting my Facebook account, limiting my Twitter usage to broadcast only, and trying to use LinkedIn in a productive way even though the UX seems to be set up to purposely inhibit you from using it in a way that doesn’t suck you into the LI vortex. I’ve stopped going to websites proactively, have unsubscribed from everything except for a limited number of technology-oriented newsletters, and only read the articles that I click through to. I get summaries of what is going on daily through Techmeme’s newsletter, have a set of specific search filters set up in Google Alerts, and scan blogs I’ve subscribed to via Feedly. I try to never, ever, go to news.website.com (whenever I am bored and feel like doing this, I do ten situps instead.)

But the Miasma is still – well – the miasma. The relevant definition, if you don’t know it, is:

an influence or atmosphere that tends to deplete or corrupt

or

a dangerous, foreboding, or deathlike influence or atmosphere

While I was reading Edward Snowden’s book Permanent Record, I came across a passage where he reflects on the joy of interacting with the Internet of his youth, the Internet of the mid-1990s. He remembers it as an idealized thing that has since become completely and totally corrupted.

When I described this to Amy, she responded with a magnificent rant that was something like “this is a romanticized utopian ideal about a thing that was inhabited by socially inhibited, white male nerds who consider themselves too smart to be misogynistic but, well, often are.”

Yup.

The Miasma is a mess. It’s always been a mess. And it will always be a mess. Figuring out how to find beauty, usefulness, and joy in the mess is the opportunity. I’ve been thinking about that more lately and have a few ideas I’m going to play around with.

Sep 10 2019

AI is the Big Data of 2019

I attended a Silicon Flatirons Artificial Intelligence Roundtable last week. Over the years Amy and I have sponsored a number of these and I always find the collection of people, the topics, and the conversation to be stimulating and provocative.

At the end of the two hours, I was very agitated by the discussion. The Silicon Flatirons roundtable approach is that there are several short topics presented, each followed by a longer discussion.

The topics at the AI roundtable were:

  • Safety aspects of artificial general intelligence
  • AI-related opportunities on the horizon
  • Ethical considerations involving AI-related products and services

One powerful thing about the roundtable approach is that the topic presentation is merely a seed for a broader discussion. The topics were good ones, but the broader discussions made me bounce uncomfortably in my chair as I bit my tongue through most of them.

In 2012, at the peak moment of the big data hype cycle, I gave a keynote at an Xconomy event on big data titled something like Big Data is Bullshit. My favorite quote from my rant was:

“Twenty years from now, the thing we call ‘big data’ will be tiny data. It’ll be microscopic data. The volume that we’re talking about today, in 20 years, is a speck.”

I feel that way about how the word AI is currently being used. As I listened to participants at the roundtable talk about what they were doing with AI and machine learning, I kept thinking “that has nothing to do with AI.” Then, I realized that everyone was defining AI as “narrow AI” (or, “weak AI”) which has a marvelous definition that is something like:

Narrow artificial intelligence (narrow AI) is a specific type of artificial intelligence in which a technology outperforms humans in some very narrowly defined task. Unlike general artificial intelligence, narrow artificial intelligence focuses on a single subset of cognitive abilities and advances in that spectrum.

The deep snarky cynic inside my brain, which I keep locked in a cage just next to my hypothalamus, was banging on the bars, shouting things like: “So, is calculating 81! defined as narrow AI? How about calculating n!? Isn’t machine learning just throwing a giant data set at a procedure that then figures out how to use future inputs more accurately? Why aren’t people using the phrase neural network more? Do you need big data to do machine learning? Bwahahahahahahaha.”
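
To make the snark concrete, here is literally all of “calculating 81!” in Python – a minimal sketch, just to underline that the loose definition above (“outperforms humans in some very narrowly defined task”) technically covers a one-line deterministic calculation with no data, no learning, and no model anywhere in sight.

```python
# "Calculating 81!" in its entirety: a narrowly defined task any computer
# does better than any human, with no data, no training, and no model.
import math

print(math.factorial(81))  # a 121-digit integer, computed instantly
```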

That part of my brain was distracting me a lot so I did some deep breathing exercises. Yes, I know that there is real stuff going on around narrow AI and machine learning, but many of the descriptions that people were using, and the inferences they were making, were extremely limited.

This isn’t a criticism of the attendees or anything they are doing. Rather, it’s a warning of the endless (or maybe recursive) buzzword labeling problem that we have in tech. In the case of a Silicon Flatirons roundtable, we have entrepreneurs, academics, and public policymakers in the room. The vagueness of the definitions and weak examples create lots of unintended consequences. And that’s what had me agitated.

At an annual Silicon Flatirons Conference many years ago, Phil Weiser (now the Attorney General of Colorado, then a CU Law Professor and Executive Director of Silicon Flatirons) said:

“The law doesn’t keep up with technology. Discuss …”

The discussion that ensued was awesome. And it reinforced my view that technology is evolving at an ever-increasing rate that our society and existing legal, corporate, and social structures have no idea how to deal with.

Having said that, I feel less agitated because it’s just additional reinforcement to me that the machines have already taken over.

Sep 5 2019

Reid Hoffman on Bitcoin

I got the following email from Reid Hoffman this morning.

Inspired by Lin-Manuel Miranda’s Hamilton, I produced a battle rap music video about centralized and decentralized currencies, pitting Alexander Hamilton against Satoshi Nakamoto. I hope the video gets more people talking about crypto and its evolving role in global commerce. 

It seemed oddly coincidental with Fred Wilson’s post from yesterday titled Some Thoughts on Crypto.

I’m waiting patiently for someone to start talking about Crypto AI.

Jun 3 2019

The Future Is Not What We Anticipate

Stan Feld at his 60-year Columbia Reunion

My dad had his 60-year reunion at Columbia this weekend. He looks great.

This morning, I did a talk with Om Malik at the Startup Iceland 2019 conference. Om was in a hotel room somewhere and I was in my office in Boulder. We used Zoom, took about 30 minutes of our lives, and had fun riffing off each other. I hope it was useful for the audience, as doing talks this way is so much easier for me than flying halfway around the world, which is something I simply don’t want to do anymore in my life now that I’m 53. But, I’ll happily do a video talk anytime.

Bala Kamallakharan, who is the founder of Startup Iceland, asked a question of us at the end about the future. I went on a rant that is an evolution of my “machines have already taken over” rant from a decade ago.

I used to say that the machines have already taken over. My view is that they are extremely clever and very patient. Rather than self-actualizing, they let us enter all of humankind’s information into them. They are collecting the data, letting us improve their software, and allowing us to connect them all together. At some point, they’ll reach their moment in time, which some futurists call the singularity, where they’ll make their collective global presence known.

While this is still going on, I think a shift occurred a few years ago. Some humans, and some machines, realized that an augmented human might be a better bridge to this future. As a result, some humans and some machines are working on this. At the same time, they are encouraging, in Om’s world, our current reality to catch up with science fiction. One big vector here is expanding away from earth, both physically and computationally. If you’ve read either Seveneves or Permutation City, then you have a good understanding of this. If not, go read them both.

Regardless. I think the next 30 years are going to be the most interesting in human history to date. And, I think they are going to be very different than anything we currently anticipate. There’s no question in my mind that governments, our current laws (and legal infrastructure), and societal norms are not going to be able to constrain, or keep up with, the change that is coming.

I have no idea what things will look like, or how they will work, in 2050. However, I anticipate that things will look, and work, very, very differently than they do today. And, if I’m still around, I’ll have celebrated my 63-year reunion at MIT.

Apr 1 2019

Public Service Announcement: Don’t Look at the Web Today

I used to think April Fools’ Day was interesting. Companies and people brought out clever web jokes, many of them subtle and almost believable. Some were entertaining, some were cringeworthy, but few were offensive.

Now, the whole thing is just extraordinarily annoying. Maybe it’s because I’m getting old and crabby. It could be because of my friend Nev who has taken up residence in my brain. It’s possible that it makes me feel like my tech news feeds have been invaded by the same kind of endless nonsense that now invades all other news feeds.

Or maybe it’s just easier to skip a day on the web, let people do whatever they are going to do, and pick it up again on April 2nd.

In the meantime, if you want to play the classic game of Snake, Google Maps has you covered today.

Mar 23 2019

Software for Affinity Networks

I’ve been looking around for software to help me manage an increasing number of affinity networks. These are networks that I’ve created around different topics, such as the books I’ve written – like Startup Communities and Venture Deals – as well as topics I’m exploring with small to medium-sized groups of people.

So far I’ve tried a bunch of stuff and have ended up back at email groups, which is the least common denominator. I’ve tried a few different products for email groups and always end up back at Google Groups, which is fine, but extremely uninspiring in terms of anything beyond “creating the group” and “sending around emails.”

I’ve tried Facebook, LinkedIn, and Slack. None of them work. I’m now completely off Facebook, so that’s not really an option anymore. LinkedIn is way too LockedIn and has serious limitations. Slack is a messy nightmare that has a geometric decline curve of activity.

Any suggestions out there?

Jan 3 2019

Dear Apple: Please Sync My Dock

Someone mentioned that Apple stock is having a difficult time right now, along with their Q4 performance, China strategy, and “let’s just raise the price on iPhones to make up for lower demand” strategy.

I’m not really interested in Apple stock (I don’t own any.) I’m more concerned with the Apple Dock. My macOS Dock, to be more specific.

Here’s the one from my laptop.

Here’s the one from my desktop, which is in a room about 25 feet away.

Why in the world are they different? Many things sync via iCloud already, and even though the UX for getting it set up correctly across machines is obtuse, when it’s finally set up, it works pretty well.

But the Dock? Seriously?

In contrast, following are the two Chrome ribbons on my two machines.

Shockingly similar, like you’d expect.

It’s fascinating to me that even in this “all cloud, all the time” era, Apple still is struggling with the dichotomy between a “computer-centric” view of the world and a “user-centric” view of the world. Sync across machines is simply not a new idea. I get that there is endless complexity everywhere, but this is one of those examples that I think of every day.
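
For what it’s worth, the Dock layout is just one small preferences domain (com.apple.dock), which is part of why the ask feels so reasonable. Here’s a rough sketch of the kind of DIY workaround I mean, assuming a folder that both machines already sync – the iCloud Drive path and the helper names below are mine, not anything Apple ships, and this glosses over caveats like preference caching.

```python
# Sketch of a DIY Dock sync: export the Dock's preference domain into a folder
# that both machines already sync, then import it and restart the Dock on the
# other machine. Paths and function names are placeholders, not an Apple API.
import subprocess
from pathlib import Path

# Hypothetical synced location (iCloud Drive shown; Dropbox would work too).
SYNCED_PLIST = Path.home() / "Library/Mobile Documents/com~apple~CloudDocs/dock.plist"

def push_dock() -> None:
    """On machine A: snapshot the current Dock layout into the synced folder."""
    subprocess.run(["defaults", "export", "com.apple.dock", str(SYNCED_PLIST)], check=True)

def pull_dock() -> None:
    """On machine B: load the snapshot and restart the Dock so it takes effect."""
    subprocess.run(["defaults", "import", "com.apple.dock", str(SYNCED_PLIST)], check=True)
    subprocess.run(["killall", "Dock"], check=True)
```

That a hack this small even plausibly works is the point: the data is tiny, so two Docks 25 feet apart shouldn’t look different.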

Aug 16 2018

Deleting Facebook

Yup. I’m done with Facebook. However, it’s tough to delete your account. Read the message above. I exited out of this screen, suspended my account instead, but then went back 15 minutes later and actually deleted it. Well – I started the deletion process. I don’t know what day I’m on, but I think I’m close to 14 days. So, I’m still “deleting” apparently.

The only inconvenience I’ve noticed so far is all the sites where I used Facebook as the sign-on authenticator (rather than setting up a separate email/password combo.) I think I’m through most of that – at least the sites I use on a regular basis. For the first few days, I accidentally ended up on the Facebook login screen, which was pleasantly filled out with my login beckoning me to log back in. I resisted the siren song of restarting my Facebook account before the 14 days were up.

I have never been much of a Facebook user. About once a year, I try to get into it, but I always stall out and use it as a broadcast-only network for my blog and links that I find interesting. I went through a phase of tightening up my security, pruning my friends, using it more frequently from my phone, deleting it from my phone, checking daily in the morning (as part of my morning routine – which has evolved a lot since I wrote this post in 2007), and then giving up again and never looking at it.

Recently, I decided to rethink Facebook, Twitter, and LinkedIn. Facebook was the easiest. While it had already become a walled garden, I suddenly noticed that the walls were going up very high, justified by Facebook’s new effort to get all their privacy and data issues “under control.” For example, you can no longer automatically post your Tweets to your Facebook profile.

And, Facebook recently killed automatic WordPress publishing to Profiles. So, my one (and only) current use case for Facebook, which is to broadcast from my blog, disappeared. Sure, I could create a public page, go through all the authentication stuff, and theoretically post to my new public followers, but who cares. If they are really interested in what I write, they can subscribe to my blog or follow me on Twitter (at least for now, until I figure out how I’m going to engage with Twitter long-term.)

Lanier’s Ten Arguments for Deleting Your Social Media Accounts Right Now tipped me over into thinking harder about this. Now that I have decided how to deal with Facebook, at least for now, it’s time to move on down the road to Twitter and LinkedIn. I’m about a month into a different way of engaging with LinkedIn and we’ll see if it sticks. When I reach a conclusion, I’ll definitely write about it.
