Nick Bostrom defines existential risk as:

> Existential risk – One where an adverse outcome would either annihilate Earth-originating intelligent life or permanently and drastically curtail its potential.
The problem with talking about "doom" is that many worlds that succumb to existential risk without literal extinction get counted as non-doom worlds. For example, a future where humanity is confined to the Solar System, rather than settling a significant fraction of the roughly 4 billion galaxies reachable from it, is plausibly a "non-doom" outcome, yet it falls squarely within Bostrom's definition of x-risk.
Thus when people discuss P(doom), the intent is often to cover only the extreme downside outcomes, so a low P(doom) such as 20% doesn't imply that the remaining 80% of outcomes avoid a permanently and drastically curtailed future for humanity. In other words, a P(doom) of 20% is perfectly compatible with a P(x-risk) of 90-98%.
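The arithmetic behind this compatibility can be made explicit. The numbers below are hypothetical, chosen only to match the example in the text: "doom" counts extinction-level outcomes alone, while Bostrom-style x-risk also counts permanent disempowerment.

```python
# Hypothetical probability decomposition (illustrative, not the author's figures).
p_extinction = 0.20                 # the narrow "doom" reading
p_permanent_disempowerment = 0.75   # curtailed-potential futures short of extinction
p_flourishing = 1 - p_extinction - p_permanent_disempowerment

p_doom = p_extinction                                   # 0.20
p_x_risk = p_extinction + p_permanent_disempowerment    # 0.95, inside the 90-98% band

print(f"P(doom) = {p_doom:.2f}, P(x-risk) = {p_x_risk:.2f}")
```

The point is simply that P(doom) is a lower bound on P(x-risk), and the gap between them can be very large.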
Ben Mann of Anthropic said in [...]
---
Outline:
(02:02) Between Extinction and Permanent Disempowerment
(03:24) Avoid Doom, Clarify Stance on Permanent Disempowerment
---
First published:
July 29th, 2025
Source:
https://www.lesswrong.com/posts/eSob8HxDbsuDhoTXt/low-p-x-risk-as-the-bailey-for-low-p-doom
---
Narrated by TYPE III AUDIO.