What is the “meaning” of the Singularity?

If we look forward to the potential trajectories of intelligence and sentience, we wonder:

  • Where is the future taking us?
  • What does the future “want”?

Some people have posited theories about the meaning of the Singularity. Here are just a few that I’ve read or heard over the last few years:

  • Love is a kind of cosmic force, and as we learn to avoid conflict with ourselves and our planet, we’ll use technology to create more of that loving “energy” to populate the galaxy.
  • Humans have arrived at wonderful values like justice, equality, and liberty, and these powerful values will continue their trajectory of “goodness” into future forms of intelligence, which will expand into the universe carrying these true and just values.
  • The universe itself is eager to “wake up” and become sentient – opening its metaphorical eyes to senses and thought, just as biological life has done. This idea is echoed by Emerson on a number of occasions:

“Man, made of the dust of the world, does not forget his origin; and all that is yet inanimate will one day speak and reason. Unpublished nature will have its whole secret told.” – Ralph Waldo Emerson, On the Uses of Great Men

Most of these ideas are interesting to consider, and to explore and wonder at the meaning of post-human intelligence is, I think, a necessary task. Even if the Singularity isn’t possible (and it may not be), or never happens, fleshing out the canvas of “north star” aims for humanity seems worthwhile.

If we are to stumble forward as a species, it probably behooves us to think through the potential meanings of our journey, and to flesh out the more desirable paths to pursue and the less desirable paths to avoid.

For idea generation, and for informing critical thinking, brainstorming about the “meaning” of the Singularity and interpreting it seems valuable.

The problem, as I see it, is that these interpretations are often taken to be the truth.

Some people brainstorm possible meanings out of interest, but some people believe themselves to have accessed some kind of “true” meaning – to have a pulse on the consciousness or intention of the universe itself – and they actually take their interpretation to be fact.

This is folly, and I suspect it will do more harm than good in the great effort to steer the trajectory of post-human intelligence.

Anyone who has a theory of the world that “makes sense of it all” is what I call a wise cricket: a feebly limited intelligence pretending to grasp a wildly complex world that is wholly beyond its ability to imagine, never mind comprehend. This is a kind of paramount hubris.

“We are to judge with more reverence, and with greater acknowledgment of our own ignorance and infirmity, of the infinite power of nature. How many unlikely things are there testified by people worthy of faith, which, if we cannot persuade ourselves absolutely to believe, we ought at least to leave them in suspense; for, to condemn them as impossible, is by a temerarious presumption to pretend to know the utmost bounds of possibility.” – Montaigne, Essays Book 1, Chapter XXVI

A friend shared this cartoon with me recently:

[Cartoon: “Birds, Singularity”]

Image credit: FalseKnees

We could read the cartoon above and interpret “Fries on the pier” as an analogy for mindless human activity (watching reality TV, eating Doritos, etc.).

Or, we could interpret “Fries on the pier” to be an analogy for the height of human thought, for the loftiest and furthest things humans can imagine.

I prefer the second interpretation.

Thinking of ultimate meaning in terms of “Fries on the pier” is exactly what humans do when they imagine “meaning” in the Singularity. Everything we love and value is the equivalent of french fries: arbitrary, and relevant only to our own small natures.

The values that we hold highest, like “Love”, “Happiness”, “Relationships”, “Creativity”, “Exploration”, and “Evolution”, are – for the most part – proxies for some evolutionary drive, or for some need that correlated with survival or mating. It could be argued that these values are hardly “our own”, but are just more nuanced versions of the same drives that lemurs or orangutans have.

In addition, all human constructs are limited by the hardware and software of our monkey suit, our hominid form. Just as your dog can’t learn the nuances of Montaigne’s essays, and your goldfish can’t appreciate the complexities of nuclear non-proliferation efforts, you (and I, and all humans) can’t imagine most of what could possibly be valued. We can’t even imagine the higher, further, deeper, or more varied modes of “valuing” that a future intelligence would have.

“Whatever falls out contrary to custom we say is contrary to nature, but nothing, whatever it be, is contrary to her. Let, therefore, this universal and natural reason expel the error and astonishment that novelty brings along with it.” – Montaigne, Essays Book 1, Chapter XXX

Our “natural” values, those which seem so self-evident to many humans, needn’t correlate at all with what nature itself is doing. It would behoove us to “expel the astonishment” of the idea that nature might have directions and forms and modes that differ so drastically from what we currently think and feel.

The meaning of the Singularity, as far as I can tell, is to figure out what the hell is going on. Painting our abstract, feeble constructs onto its meaning is useless, but leveraging the Singularity to discern what “the good” is, and what the universe is – that seems to be all we have.

As I’ve mused in a previous essay about the preeminence of “exploring the good”, we need to find what is good before we can do good, and acquiring this understanding of the world and our place in it (at a much more robust level, with much more robust cognitive resources from AGI or cognitive enhancements) might be as much “meaning” as we can hope for while remaining intellectually honest.

In addition, there may never be a “ground truth” to stand on, only a deeper web of forms and more ways of comprehending and making sense of them to further our aims. It’s possible that we never land on firm moral grounding, even when deity-level AGI exists and “thinks” with countless megatons of computing power and “senses” with a million-billion sensors that we can’t possibly imagine.

It’s less comfortable going to sleep with these ideas than it is to believe in an all-loving God who has ushered our dead relatives into a better place, or that we can achieve Nirvana if we simply follow the right steps.

Alas. My highest values, and yours, are but “fries on the pier.”

 

Header image credit: Disruptor Daily