The Political Singularity is a hypothetical event where global and national politics centers almost exclusively on issues related to post-human intelligence and power.
Most of the attention will likely center on a single set of related questions:
Will we decide to create post-human intelligence?
Who will control these technologies / get to determine their use (which leaders, companies, nations, etc)?
Essentially, when post-human intelligence becomes remotely viable or within view, it will be the preeminent concern, chiefly because the transition to the singularity will be either the best or the worst thing to happen in the known universe (from a utilitarian perspective).
Once technology reaches a specific threshold of capability, people around the world will be downright “spooked” by it and enamored with it – and nothing else in the political realm (whether we build a new playground for the elementary school, who is pro- or anti-immigration, etc.) will matter much at all by comparison.
I created this image back in 2018 when Political Singularity was first published, so this is extremely dated, but I’ll leave it here anyway:
In 2011, while studying cognitive science during my master's at UPenn, I saw research involving the remote control of moths and beetles by stimulating their brains with electrical signals. That same vein of research has continued in the years since, allowing for relatively dexterous control over beetle movement:
For most of you reading this, watching the video above isn't enough to send you into a panic – but there is some threshold that would.
Who knows what will be the straw that breaks the camel's back (i.e. the event that makes the entire developed world turn to post-human intelligence as the preeminent political concern), maybe:
The transition to a Political Singularity might be gradual, but my guess is that a series of pivotal events will bring it about.
It might simply be a jarring or troubling scientific breakthrough (like what I’ve listed above), but I think it’s more likely that it will be a Pearl Harbor-like tragic event, though hopefully on a very small scale (I’ve written much more on the nature of AI Disasters here).
Either way, once it happens, life won’t be returning to “normal.” The remainder of the human condition – which likely won’t be long – will probably involve wrestling with these issues continuously until we attenuate entirely or some of us turn into some kind of posthuman entities.
The prospect of becoming a subservient species to a higher intelligence is an ultimate threat – worth doing anything to prevent. The prospect of becoming or creating God is an ultimate opportunity – worth doing anything to achieve. This will drive up efforts for substrate dominance and for the control and development of artificial intelligence.
It’s easy to see how this could turn into a conflict between the nations or groups that want to create – or do not want to create – post-human intelligence. Hugo de Garis’ book “The Artilect War” painted a picture of pro-AGI and anti-AGI human conflict back in 2005. It’s my belief that this prediction will one day be seen as prophetic.
Writing this in 2018, it sure would be nice to see people fighting about something other than Donald Trump’s latest tweet, immigration, or whether gender is biological or merely a social construct.
On the other hand, it’s no relief to be at the precipice of passing the baton of species dominance to something beyond ourselves. A catch-22, I suppose.
Header image credit: sh.wikipedia.com