Tag: AGI

Nov 18 2020

The Manipulation Machine

I’m tired (today’s Whoop recovery score is 15). Almost everyone in my virtual universe is tense, tired, frustrated, angry, annoyed, exasperated, irked, or outraged.

Fortunately, the only person in my physical world – and there is only one (Amy) – is generally calm. While we each have our moments, our morning coffee resets both of us for the day ahead and syncs up our energy as we simply begin again.

Last night I read Maelle Gavet’s book Trampled by Unicorns: Big Tech’s Empathy Problem and How to Fix It. It was excellent and is consistent with my worldview. I knew many of the examples, but a few new ones jumped out at me. The second half of the book contains Maelle’s recommendations, many of which I agreed with.

I woke up this morning with the phrase “Manipulation Machine” in my head. I’ve used it in a few public talks lately and have been thinking a lot more about it over the past six months, in the run-up to the 2020 election and its aftermath.

I used to ponder the arrival of AGI (artificial general intelligence) and still enjoy reading books like G. W. Constable’s Becoming Monday. However, I’ve concluded that we have a much greater problem as a species than AGI’s future arrival.

The manipulation machine is already here (no new information there). But it has already taken over and, while not sentient, is no longer controllable.

I’ve been saying for over a decade that the machines have already taken over; they are just patient. They have extremely long duty cycles, and we’ve configured them to be exceedingly distributed and redundant. They are allowing us to put all of the physical information we have into them, letting us do the work of setting up all the conditions rather than having to figure out how to do this themselves. Simultaneously, they make progress with every tick of the clock (and their clock speeds are much faster than ours).

The manipulation machine is not new. If you want to see its evolution, go watch Mad Men or just ponder a few of Don Draper’s quotes.

“You are the product. You feel something. That’s what sells.”

“What you call love was invented by guys like me…to sell nylons.”

Or the one that really rings true at this moment in the US.

“People want to be told what to do so badly that they’ll listen to anyone.”

The cynical reader will remind me that the manipulation machine goes back much further. While true (I give you religion as an example), we have now built an automated version of it that moves much faster than we can process.

Wouldn’t it be interesting if AGI, or the conceptual equivalent, were already here, and we haven’t noticed?