Scoffing at AGI isn’t Intellectually Honest Anymore
In 2025, it is no longer intellectually honest to completely shun the idea of artificial general intelligence (AGI) or AGI risk. Yet still, in Dec 2024 (the time of this…
From Emerson’s Self-Reliance: This one fact the world hates, that the soul becomes; for that forever degrades the past, turns all riches to poverty, all reputation to a shame,…
If life is a flame that we presume to have started some 3.5B years ago, with a feeble simmering in some volcanic pool somewhere, then we might think about all…
Discussions around AGI and technological progress often hinge upon near-term policy decisions. Okay, so we want this or that near-term decision on governance. More privacy. Less regulation. Whatever. But all…
The year is 20,000,000 BC. In a jungle somewhere in Pangaea, a green clearing on the side of a great sloping mountain is made entirely brown. Not by mud or…
Ask a politician, businessperson, or friendly neighbor what their hopes are for the future, and odds are they’ll tell you they hope that the world is a better place…
[If that title comes across as shocking or offensive, bear with me. Before you make assumptions about my opinions on post-human morality and moral stratification, please read the article.] I…
The following quote is as good an introduction to this article as I could ask for: “…he saw in Java a plain far as the eye could reach entirely covered…
The AI value alignment argument goes something like this: (1) artificial intelligence will continue to approach human-level intelligence; (2) artificial intelligence will generally be driven by a reward function, by a…
What matters most? If we have a reasonable chance of building conscious AI and/or post-human intelligence in the next 60 years, it makes sense for us to consider where we’re…