
What will be the median p(doom) of AI researchers after AGI is reached?
Above 5%: 84%
Above 10%: 67%
Above 20%: 25%
Above 50%: 9%
Above 80%: 5%
AGI defined as an AI that is better at AI research than the average human AI researcher not using AI.
p(doom) defined as human extinction or outcomes that are similarly bad.
In Katja Grace's 2022 survey, median values were 5% for "extremely bad outcome (e.g., human extinction)" and 5-10% for human extinction.
All answers that are true resolve Yes.
This question is managed and resolved by Manifold.
Related questions
- Will OpenAI be in the lead in the AGI race at the end of 2026? (54% chance)
- What will be the P(doom) of these individuals when Manifold thinks ASI is <1y away?
- What will be the average P(doom) of AI researchers in 2025? (20% chance)
- When will OpenAI have announced they have achieved AGI?
- ML researchers' median probability of existential risk from AI
- Will AGI retaliate on AI doomers in a way that makes AI doomers regret it? (3% chance)
- Doom if AGI by 2040? (45% chance)
- The probability of extremely good AGI outcomes, e.g. rapid human flourishing, will be >24% in the next AI experts survey (54% chance)
- The probability of "extremely bad outcomes e.g., human extinction" from AGI will be >5% in the next survey of AI experts (75% chance)
- Will we reach "weak AGI" by the end of 2025? (27% chance)