Reflection on John Harris’s “Enhancements Are a Moral Obligation”
I’ve gotten my hands on a new copy of Human Enhancement, edited by Julian Savulescu and Nick Bostrom, and the first article I chose to delve into was titled “Enhancements Are a Moral Obligation.”
There are people who imagine a future in the year 4000 that is nearly identical to 2025, but with robot butlers and a few other sci-fi conveniences.
These people believe, understandably, that humans do not want post-human or even trans-human transitions. Augmenting the mind, or creating minds vastly beyond the human, seems obviously sacrilegious, and something most humans will obviously resist.
But in this article I’ll argue that the opposite is true, that in fact:
The trajectory is already starting, and we must think about guiding it well now, as if it is a process in motion, rather than as if it is a process we can “push start” on in 2 or 10 or 200 years (as some suggest).
In the Intelligence Trajectory Political Matrix I discuss how different people have different kinds of “end goals” they might want to pursue. I’ll argue in this article that the “Preservation” camp is actually somewhat weak – that any small inclination toward Progression very quickly turns into Ascension and posthumanism – and that we should approach innovation and regulation as if the transhuman transition were already underway.
I’ll argue that we must focus on bending, rather than entirely pausing, the posthuman trajectory:
Below I’ll list three of the most compelling reasons that we’re already moving swiftly toward posthumanism (with no option to freeze “homo sapiens 1.0”).
I’ll conclude with what this implies for public policy and steering our next steps as a species.
The hedonic treadmill is the idea that an individual’s level of happiness, after rising or falling in response to positive or negative life events, ultimately tends to move back toward where it was prior to these experiences.
Winning the lottery doesn’t drive up long-term fulfillment.
People in 2024 with 3000 TV channels (and YouTube) aren’t tangibly “happier” than previous generations who just had newspapers or AM radio and little else.
Here’s how this plays out with regard to the posthuman transition:
You say:
“But Dan, most people aren’t just going to escape into some AI world of pleasure simply because the technology makes it easy to do so.”
I’d reply immediately by telling you that most people are seeking escape, and that drug, pornography, and video game addiction – and indeed the religious ideas of samsara and the vale of tears (among others) – are all pointers to the fact that most people want to escape the state of nature. If given a viable option, they would.
But also, pleasure is not the only pull to digital immersion. Power is also a motive.
We can think about the first batches of ravenous tech adopters who wave good-bye to the human condition as roughly fitting into two camps – explored in more detail in the full World Eaters vs Lotus Eaters article:
Those who want to be productive in the “real” world will not do so by walking around in the monkey suit, but by wielding more and more of their volition through myriad virtual means. The cost of being ambitious in the era of AI-enabled work and early brain-computer interface tech will be giving up the current human condition almost entirely (full essay: Ambitious Tech).
This won’t be just for “tech fanatics”; it will be a necessity for anyone who wants to be productive and relevant in an increasingly virtual future. It’s an attractor state that I would argue can’t be stopped.
It should also be noted that even the base pursuit of pleasure in AI / VR worlds will not simply be a simulation of the current human condition (i.e. being in a hot tub with Tinashe or something). The hedonic treadmill will quickly stretch into realms of experience that have no proxies in the natural world – into strange, personal universes of experience optimized not for fidelity to reality, but for whatever the user responds to (full essay: The 7 Phases of Posthumanism).
People aren’t in some way eternally “satisfied” (in their wellbeing, productivity, or anything else) by some specific single improvement in their personal or professional lives.
They immediately look out for the next improvement, the next advancement, the next convenience, the next saver of time or money, or the next faster mode of fulfilling their core desires.
As I mention in my Your Dystopia is Myopia article, when people think of radical changes to the human condition (such as humans living in VR worlds for most of their waking hours, replacing most of their human relationships with more fulfilling and useful AI relationships, or getting brain-computer interfaces that let them understand more of the world or get more done by splitting their brain activity across multiple tasks at once), they’ll typically say:
“That would never happen, because (a) that technology is impossible, and (b) people would never do something so unnatural!”
But we are already monstrously “unnatural” in our daily lives today compared to our ancestors just 2-3 generations ago – and our world today is full of tech that was previously considered “impossible.”
To paraphrase and riff on Planck’s Principle:
“Science progresses one funeral at a time” – or – “Tech adoption progresses one birth at a time.”
My father did a thousand “unthinkable” things in the eyes of my grandparents. He grew his hair long, he married someone who wasn’t Italian, he listened to rock and roll music.
Humans tend to think that the music that was popular during their coming of age is the greatest music ever created, and that all that follows is junk.
Humans tend to think that technology introduced when they were 20-30 is “cutting edge”, and technology beyond that point is unnatural and strange.
Even without any fundamental breakthroughs or advancements in AI tech, we’re nearing the level where programmatically generated experiences will replace many of our usual ones.
Young people with exposure to the tip of the iceberg of this tech will not say “Oh, but this is unnatural, I shouldn’t do this,” any more than my father thought “Oh no, this ‘rock and roll’ is clearly a bad thing I better not listen to it.”
People will do whatever helps them achieve pleasure or power, and if new, interesting options arrive (especially if those options are already popular among some cliques of the youth), young people will jump right in. Posthumanism is on its way now, and we must bend its trajectory as the process occurs; there will be no waiting 100 years until “we get AGI right” (if such a thing is even possible).
(Ben Goertzel discusses how youth are pushing AI adoption in his interview here. Lambert Hogenhout – who leads forecasting at the United Nations – discusses how willing young people are to adopt radical changes to the human condition in his interview here.)
I consider this to be the most damning reason that there is no pause button on the posthuman transition.
The argument goes like this:
Here’s a graphic that expresses this idea:
This is explored in more depth in You Don’t Want What You Think You Want, and in Closing the Human Reward Circuit.
People do not want “real” or “natural”, they want the fulfillment of their drives.
Humans constantly oscillate between drives, and as soon as their goals (pleasure, power, or otherwise) are fulfilled more cost-effectively through something “not real” or “not natural”, they’ll adopt it.
Sacredness is not a barrier that will prevent changes in the human condition. Your personal level of discomfort is not going to stop the rest of humanity, particularly younger people, from moving ahead onto monstrous new kinds of experience.
The forces holding onto the “2020s human experience” are giving way to its obliteration faster than in any other decade in human history – perhaps by a factor of 100.
The hedonic treadmill, the new “norms” of the youth, and human nature itself are already in the process of turning us into something beyond or other than what is human.
Given that this transition is underway already, we should be discussing real, viable futures, not “eternal hominid kingdom” fairytales.
We need to ask:
Taking our heads out of the sand means recognizing that we need to think about our direction now.
Transcendence is already well under way.
We must bend (for the better) the posthuman trajectory – rather than attempt to pause or stop it.
Enough with fairytales of eternal AI caretakers, of the 2020s lasting forever, or even of homo sapiens lasting much longer. Let’s be careful that we don’t move recklessly and screw up the hand-off, but let’s talk frankly about what’s next, because the lily pad we’re standing on now is sinking.
(Note 1: For the record, I am not against efforts like “Pause AI” in principle. It may well be that slowing things down is best, and I’ve long argued that global AGI governance in order to slow things down would be a net good. I simply believe it is false to think that a complete pause is possible. We can bend the trajectory of change; we cannot dam the waters entirely. This is the one fact the world hates – but we must face it manfully and paint possible futures amongst the uncomfortable changes that inevitably lie ahead.)
(Note 2: Getting leading thinkers to discuss what a worthy successor is, and how to move towards it, is what I’m doing in the Worthy Successor interview series on The Trajectory YouTube channel. Tune in if that’s a dialogue you want to be part of, and feel free to suggest guests. Bostrom and Bach were really fun ones.)