Your “Dystopia” is Myopia

I’m writing to you as a friend.

Beliefs about the future are limiting your potential, and they’re working against your future wellbeing and power. I don’t want that for you.

Follow me for a minute.

Imagine traveling back in time 60 years and telling your grandmother about what your day-to-day life is like now, the year 2022: 

  • You stare at screens for 9 hours a day for work, and another 5 hours a day for relaxation.
  • You use Uber to ride in a stranger’s car, and Airbnb to sleep in a stranger’s bed.
  • You watch internet porn of myriad varieties (people, octopi, robots, many race / gender permutations, etc).
  • You work 50 hours a week yet you have never met your coworkers or customers in person – or only rarely.
  • Etc…

She’d call this life a “dystopia.”

She’d likely also think it was sacrilegious, inconceivable, and radically “impossible” for two reasons:

  1. Science will never advance so far so soon.
  2. Humans would never be so morally corrupt as to change their lives and societies that much.

But when you wake up, order Starbucks through Uber Eats, and sit down to write code or answer emails or join Zoom calls for 9 hours straight, you don’t feel like this is impossible, and you don’t feel like you’ve gone morally astray from your grandmother’s standards.

If anything, your situation is much better. More convenient, more nimble. You can access what you need (friendships, productive work, information, entertainment) instantly, while your grandmother had huge limitations in her available career paths, her choice of entertainment, her choice of friendships, etc.

And we’re just getting started.

As technology advances rapidly, procedurally generated content becomes incredibly powerful, VR becomes more realistic, and first-world humans spend the majority of their waking hours on screens (often screens whose content is conjured forth by AI: Netflix, social media, etc.), you begin to contemplate where society is going, and what the future of the human condition might be like:

  • Human beings living in immersive VR, taking care of their physical needs only as maintenance requires (Husk).
  • Friendships being replaced by selfless, infinitely wise and helpful AI-generated “friends” who can provide richer and more robust emotional and intellectual support than any real human (programmatically generated everything).
  • Nation-states battling to control the computational substrate that houses human experience (Substrate monopoly).
  • Brain-computer interfaces altering human minds to experience permanently higher levels of wellbeing, or greater creativity, memory, etc.
  • Etc…

You call these scenarios a “dystopia.”

They violate your ideas of what is sacred, and they are “impossible” for two reasons:

  1. Science will never advance so far so soon.
  2. Humans would never be so morally corrupt as to change their lives and societies that much.

Your “dystopia” is myopia. 

It’s a failure to see how fast changes to the human condition have occurred over the last 30 years, and a failure to accept that even more radical changes are soon to come.

But these changes in the human condition are merely a continuation of the same dynamic:

Humans adopt all technologies that satisfy their needs (financial, sexual, emotional, or otherwise) more quickly, more conveniently, or more completely than other available options.

Humans have almost never wanted the things they claim to want (a new Corvette, a million dollars, a walk in the woods, a girlfriend) for their own sake. They want the fulfillment of particular needs and drives. Whatever technologies or processes fulfill those drives faster and better get adopted.

The statements that follow are self-evident to you, but would have seemed like sacrilege to your grandmother 60 years ago:

  • “There’s no need to drive into town and fight for parking when I can jump in a stranger’s Uber for $17.”
  • “There’s no need to go to church to meet a future spouse when I could swipe on Tinder.”
  • “There’s no need to commute to work when I can do all my work and all my meetings from a 14-inch-wide piece of glass in my house.”

These statements, or something just like them, will be self-evident to people in a few decades, but they seem like sacrilege to you today:

  • “There’s no need to type on a QWERTY keyboard in an upright chair when I can recline in my immersive VR haptic suit all day and not only have richer immersive experiences, but be way more productive at work in the process.”
  • “There’s no need to go on dates with real people when I can have a perfectly personalized AI-generated ‘partner’ who never wrongs me, and supports my sexual, emotional, and intellectual needs better than any flawed and selfish human ever could.”
  • “There’s no need to meditate to feel relaxed and balanced when I can set my brain-computer interface to whatever pleasing emotional state I want.”

I don’t know which of VR, AGI, or brain-computer interfaces is going to land first, so I’m not going to make hard predictions here. But I’m smart enough to know that some of them will land, and that the resulting changes to the human condition will not be something I can stop.

But friend, why do these ideas bother you?

I’m telling you it’s fear.

And this fear is weakness.

The future is coming at us – hard and fast – and the ostrich strategy is no longer viable.

Grandma could put her head in the sand and never learn to use a laptop, order an Uber, use the latest sales enablement platform, spend 10 hours a day on Zoom calls, or even shop online. She had the option to run out her remaining years without having to adjust to these “sacrilegious” changes to the human condition.

You, however, have no such choice.

You don’t get to say “I’m only comfortable with technology that is <10% different from that which I experienced in my 20s and 30s.” 

You won’t have a job. You won’t be able to be in touch with friends. You won’t be able to support the causes you care about. It’s all changing too fast now.

You accuse me:

“You’re just telling me to mindlessly adopt these new technologies?!”

“You’re saying I should give up on my values, and on the life I want to live?!”

No.

I’m telling you that if you have values you care to uphold, you don’t get to freeze them in time like the Amish. 

You have to deal with the changes to the human condition, and mold them proactively to be more of what you believe is good, and less of what you believe is bad.

I’m telling you that you can only carry your values into the future if you step into the future – with the tools, resources, and new powers that future will contain. You cannot step out of the river; the water is rising and the pull is too strong.

I couldn’t be a good friend without telling you frankly:

If you have values that are important to you, you must imbue them into the future that’s coming at you – rather than denying or hiding from it. If the future disturbs you – face it squarely, take action, don’t count on your sense of disgust or ideas of the ‘sacred’ to save you.

i.e.:

  • Stop referring to all radically changed human conditions as “dystopic.” You live like a monster in the eyes of your grandparents, but you don’t feel like a monster. Future generations will seem “monstrous” to you unless you embrace change.
  • Accept that Luddite escapism will not serve your aims. Opting out of the next big waves of technological change would be just as backward, ridiculous, and shortsighted as your grandparents opting out of electricity and plumbing, or you opting out of the Internet.
  • Embrace the opportunity to bake the values we find important into the future. There is no hiding from what’s to come, but there is molding and influencing. Build what you want to build and live as you want to live, but plan for it to be wildly different from your experience, and plan to adapt radically to this new world.

 

Header image credit: Edsurge