In history, we are reminded that all is in flux. America is likened to Rome. China today is likened to the Tang Dynasty of many centuries ago. Europe was once mired in barbarity, then became the leader in technology and economic development. Empires and kingdoms rise – and then, at some point, fall.
“They are the sowers, their sons shall be the reapers, and their sons, in the ordinary course of things, must yield the possession of the harvest to new competitors with keener eyes and stronger frames.” – Ralph Waldo Emerson, Essays: Second Series, “Manners”
We assume that these cycles can and will continue indefinitely – with different ethnic groups and different styles of government gaining and losing prominence.
We forget that the same is so with species.
I present a hypothesis:
The advent of artificial general intelligence will make the leading technological and economic power of the 21st century into the final human kingdom. This kingdom will largely develop, release, and (at least initially) direct the trajectory of post-human intelligence that will ultimately overtake humanity.
This implies that – at some point – national interests will commandeer AGI development efforts.
In China, “private sector” companies are already directed by the CCP, and entirely beholden to the government’s whims and its goals for expansion and power. In the USA, we can’t expect the DoD to sit back for long and watch as an AGI lab is clearly about to birth something more powerful than the US military. When that moment comes, or as we approach it, I suspect the tanks will roll up to OpenAI’s (or whichever lab’s) doorstep, and the keys will be handed over.
It is possible that, for hundreds of years, different empires and systems of Homo sapiens governance will rise and fall. I suspect this will not be the case, and a great many AI researchers expect a Singularity scenario before the 2070s (Dec 2023 note: This article was published in 2019, when 2060 AGI timelines seemed reasonable. Now, nearly everyone’s timelines are shorter).
If these are the Final Kingdoms, here’s what that might mean:
I have no crystal ball, and I can’t be sure that the Final Kingdom premonition will hold true, but for now I suspect it will.
If the Final Kingdom is a remote possibility, then I would argue the following:
My own interviews with Chinese and US AI innovators (both CEOs and researchers) – and multilateral AI conversations between the US and China at the United Nations – lead me to believe that conflict isn’t necessarily inevitable… though I still consider it likely.
The cosmopolitan spirit is stronger than it has ever been, and we’ll need as much of it as we can get in the years ahead. Maybe humans will create a kind of shared Final Kingdom together. Maybe, as Hugo de Garis has suggested, the greatest war ever fought will be fought over this issue of species dominance (read: Political Singularity).
Time will tell.
Header image credit: vision.org