What the Worthy Successor Is and Is Not

Since the publication of the Worthy Successor essay in 2023, I’ve been glad to see the enthusiasm for exploring posthuman futures among people in tech and policy (especially on Twitter, but also at AI and AGI events).

But I’ve also been a little disappointed in what people who haven’t read the article, but have heard the term, think it means. Often, people assume that Worthy Successor implies a mad, lawless, reckless race to build AGI just to replace humans with something vaguely more “powerful.”

No one who reads the original article could possibly think that’s what I mean, but on Twitter few people have time to read an article, which is why I try to communicate my ideas in small tables and graphics.

So I’ll do the same with what the Worthy Successor is and is not:

To add some detail to the table below, with related links to essays that flesh out the points more clearly:

  • There is no predisposition to speedily racing to AGI in the Worthy Successor. There is a belief, instead, that such transitions to posthuman entities are likely inevitable, but that they should be approached with remarkable care to ensure that they (a) preserve and expand current value, and (b) are capable of opening up new realms of value (potentia). This almost certainly requires international coordination and governance of AGI-related tech. [Related resources: Potentia, The Business of Value Itself, Potestas]
  • There is no attempt to eternally define worthy now and forever. There is merely an emphasis on doing our best to define and optimize for it, and to then allow for an unraveling magazine of new (potentially higher, greater) values to emerge – which may self-evidently be beyond human conception, and beyond our fettered, hominid conceptions of what worthy means. [Related resources: Axiological Cosmism]
  • There is no bent against humanity. While the Worthy Successor idea implies that humanity will attenuate and may (ultimately: should) be superseded by higher-potentia entities, humanity both (a) is the most morally valuable kind of entity we have yet discovered in the universe, and so is not to be leapt beyond recklessly, and (b) seems to have some degree of volitional control over how the next permutations of intelligence emerge into the universe. While the Worthy Successor view does hold that there are hypothetically more powerful and valuable entities than humans, there is no drive to move beyond humanity without first deeply understanding and knowing that we can optimize for value (via cognitive upgrades, building silicon entities, etc.). [Related resources: The Business of Value Itself]
  • There is no bent on “breaking up the happy home” of an eternally human world by hurling AGI into the mix just to spite humanity. Rather, the Worthy Successor view is that potentia seems to bubble into higher and higher forms continuously, and in the long term humanity (like every other individual species) will have the fate of totally attenuating, or of turning into something else. Myriad forces (including our own motives and desires) are pulling us away from “default” unaugmented hominids already, and our role is to think seriously about what we’re turning into, and how we might avoid a collapse of earth-life in that critical process of transformation. [Related resources: Bend, not Pause, Baton]

“Worthy Successor” is just a pithy way of expressing the philosophical position of Axiological Cosmism:

  • Worthy = Axiological = “There is value now, and there are likely to be near-infinite magazines of new kinds of value unfolded by new potentia expanding into new realms of power and experience. The current value is worth preserving and optimizing for – but the future value deserves to be explored and opened up.”
  • Successor = Cosmism = “Long-term, human attenuation is imminent, and long-term, intelligent, potentia-expanding life will and should take forms wholly beyond human conception.”


Header image credit: dailypioneer.com