Green Eggs and Ham – Facing Future Technology and AI Like an Adult
If you were a child at any point in the last 50 years, you’re probably familiar with the story. Sam I Am tries to get a protagonist (let’s call him…
We can basically map business survival logic to civilizational survival logic, and in this article I’ll argue that we should.
In both the case of stewarding financial resources and moral value itself, we must:
When I say “value” here, I tend to imply (a) consciousness (i.e. the presence of qualia, sentience), and (b) autopoiesis (i.e. the continual expansion of all powers that help keep life alive, or potentia). This is just my definition of moral value, but the business analogy applies just as well to your definition of value.
We might visualize this analogy this way:
Faced with any given situation in business, responsible owners might ask: What will generate the most profit long term?
Similarly, humanity, faced with massive impending changes from AGI, neurotechnologies, and other changes, must ask: What will generate the most value or “good” long-term?
To make this general “Profit = Moral Value” analogy stick, I’ll explore a few more concrete examples, each of which symbolizes an important principle of “stewardship” – both for money and for moral value:
1. Carefully Understand and Expand Value (Food truck analogy)
2. Embrace Change When Necessary (Kodak analogy)
3. Take Extreme Risks Only When Needed (Tanning salon analogy)
Let’s dive into the first example:
Food Truck Analogy:
Imagine you’re a struggling entrepreneur running a food truck who discovers that selling cheeseburgers is the first thing that truly works.
Finally, your nearly-starving family can eat because you’ve found something – at last – of value to customers. A product that works.
The rational course of action is to continue refining and scaling what works, not to immediately pivot into unrelated experiments like lasagna or sushi. Make sure the cheeseburger profit keeps coming in, make sure bills are paid and bankruptcy is avoided – only then consider leaping into other products or services.
If the food business makes enough money you might go from food trucks to physical restaurants. With the profit from those businesses you might even get into real estate or other ventures.
But you’re not making those leaps into new endeavors without having (a) consistent profit from your cash cows, and (b) a diligent process for determining which new endeavors to enter.
Lesson for Humanity:
It would be reckless to build AGI before we understand if it has the kinds of moral value we hope to preserve.
It makes sense to (a) understand deeply what we define as “value”, and (b) ensure as best we can that whatever new substrates we’re building intelligences into (AGI) actually have said valuable qualities before we grant such an entity vastly more power than human beings.
In the meantime, though augmenting human minds carries many dangers of its own, it is at least a pathway that preserves consciousness and some of the creative and autopoietic (potentia-expanding) powers that we would hope to see in any intelligence charged with stewarding life itself forward.
To maintain the analogy of the flame and torch (read more here), we ought to ensure that a new torch can actually catch flame before we risk our current torch going out entirely while passing the flame.
Just as for a business it is best at every moment to do what makes the most sense for future profit, for humanity it is best at every moment to do what is best for the future of value itself.
Mistake for Humanity to Avoid:
The following would be mistakes:
Why?
Because they all involve leaping to a new vessel or medium of value without understanding the value-related impacts of said choices. One misstep and there may be no going back – a new, more powerful entity may run the world, one which may carry none of the elements of value we would have hoped would populate the multiverse.
It makes sense to study the new substrates or types of minds we’re creating to at least ensure that they’ll be the right carriers of value into the future – just as we want to ensure that a second torch is lit before we let the flame go out on the first one.
Kodak Analogy:
Imagine you ran Kodak in the year 2000.
In the face of digital photography, you could cling to traditional photography, investing heavily in revitalizing its stores, packaging, and marketing – all in an effort to resist the changing tide of digital. This approach didn’t just fail – it led to the company’s effective downfall. The refusal to embrace the future, even reluctantly, proved fatal. No more profit for Kodak, ever. Lights out.
The lesson is clear: when transformation is unavoidable, the rational path is not denial, but thoughtful adaptation. Soothing, comforting false futures (e.g. one where digital photography never catches on) aren’t just a waste of time, they’re a distraction from the hard work of surviving and keeping profit flowing.
We must ask not how to preserve the old form, but how to preserve and even elevate the values that mattered within it – what made it meaningful. If paper photos were beautiful and emotionally resonant, how can those qualities be carried forward into a digital world we can’t stop?
That’s the mindset Kodak should have adopted, and what any forward-looking thinker or organization must do when faced with unstoppable change.
Lesson for Humanity:
It is completely unwise for humanity to aim for an eternal hominid kingdom – a world controlled forever by humans-as-they-are (in 2025). A “stasis” future isn’t possible, and it behooves us to ask openly what we are turning into and how we should turn into it – rather than spinning dangerous fairy tales about impossible “human forever” futures.
We must accept that change will come to our minds and our physical form, and explore those possible expansions discerningly to determine how to steward life and value itself.
Just as for a business it is best at every moment to do what makes the most sense for future profit, for humanity it is best at every moment to do what is best for the future of value itself.
Mistake for Humanity to Avoid:
The following would be a mistake:
Marriage to any specific product or market is – ultimately – scorn for profit itself. Marriage to one product is fruitless.
Marriage to any specific form for intelligence or vessel / substrate for moral value is – ultimately – scorn for value itself. Marriage to the human form for its own sake would be a mistake.
Marriage to the torch is scorn for the flame:
This particular error has been explored in my essay Eternal Hominid Kingdom, and in the article Baton, where I argue that long-term, humanity has only 4 viable endgames:
Tanning Salon Analogy:
The year is 2009. You run a tanning salon in a quiet suburb, a 30-minute drive from two relatively large metro areas.
The building you rent is shabby, and the location isn’t particularly easy to find, but rent is reasonable and through a combination of Yellow Pages ads and SEO you are slightly profitable for over a year.
2010 arrives, and the Affordable Care Act imposes a 10% federal excise tax on indoor tanning services. Your already-slim margin evaporates – it’ll be nearly impossible to feed yourself without that profit, and your tanning beds are, overnight, more of a liability than an asset.
You also get a notice from your landlord that the town is closing down your building because it doesn’t meet code and is deemed dangerous.
Kodak had the option to adjust over time to new business conditions. You have no such time – your current business is over in a matter of weeks.
So, what do you do?
You need to keep profit coming in, you know that’s the goal.
You will have to move to a new location and open a totally different kind of business with different products or services. Your tanning beds, you determine, will almost certainly have to be sold for next to nothing or given away.
You take stock of which of your skills are most likely to be valuable in the market. Here’s the inventory you end up with:
With very little starting capital, you decide to spend your remaining financial runway on a business that can capitalize on a known need in the market that might be captured with two of your profitable skills. You decide to sell SEO and online marketing services, including retainer SEO services, content creation, and email marketing services.
Lesson for Humanity:
If the quarterback is going to get tackled on fourth down, with seconds on the clock and trailing by 5 points, he may throw a Hail Mary pass. It’s not something he would do otherwise, but when it is the best of all realistic alternatives to win the game, he does it.
If a business owner is faced with the total ruin of their current business, they may hurl their resources into another related industry and make the best use of their viable skills. It’s not something they’d do otherwise, but when it is the best of all realistic alternatives to maintain profit, they do it.
If humanity is faced with near-certain ruin, we may hurl our efforts into creating posthuman intelligences (AGIs) that we hope will be conscious and carry the qualities of moral value that we wish not to see snuffed out of the universe. It’s not something we’d do otherwise, but when it is the best of all realistic alternatives to maintain value in the universe, we’d do it.
Just as for a business it is best at every moment to do what makes the most sense for future profit, for humanity it is best at every moment to do what is best for the future of value itself.
Mistake for Humanity to Avoid:
The following would be a mistake:
If in the near future war, or idiocracy, or theocracy, or population decline (or whatever else!) were to loom very large in potentially squashing civilization entirely – we should give more thought to what kinds of “Hail Mary passes” we can throw – which might include mind uploading, brain-computer interface, and AGI.
It would make sense in such a dire and existential position to push harder and faster towards other channels which might carry value beyond humanity.
If the flame is definitely about to go out on your first torch anyway, it makes sense to find whatever might catch fire and make a valiant attempt to pass the flame.
(Note: Some people might argue we’re in such a situation already [“the dangers are already so great, why not rush to build AGI now!”] but at the time of this writing [April 2025] I don’t think our other existential threats are strong enough to warrant reckless AGI development yet.)
I get it, the analogy isn’t perfect:
Despite these differences, the mandate that we should consider how to “steward the expansion of moral value as we would profit” is important.
To repeat what I opened this essay with:
We can basically map business survival logic to civilizational survival logic, and in this article I’ll argue that we should.
In both the case of stewarding financial resources and moral value itself, we must:
In order to steward value forward as individuals and as a global civilization, I hope for more of the following:
Entrepreneurs can sometimes play pretty recklessly; if one business fails, they can often just get a job, or start another one. Bankruptcy is not fun, but it is far from a permanent financial death.
In the natural world, if one species goes extinct, others will evolve and grow beyond them and the flame of life continues.
But with AGI, we’re likely not blessed with as many chances. So before we make any moves that might extinguish the torch of humanity, we should make sure (or, as sure as we can) that the trajectory we set out on is one that blooms with vastly more value than our present world – even (especially) when humans are no longer there to experience it.