
AI safety community successfully advocates for a global AI development slowdown by December 2027
22% chance
This market resolves to YES if, by December 31, 2027, the AI safety community successfully advocates for and achieves a significant global slowdown in frontier AI development.
A "significant global slowdown" requires at least two of the following:
A formal international agreement signed by at least 3 of the top 5 AI-producing countries to limit or pause certain types of AI development
Major AI labs (at least 3 of the top 5) publicly committing to and implementing significant voluntary slowdowns
Implementation of substantial regulatory barriers to rapid AI development in at least 3 major AI-producing countries
The slowdown must be explicitly connected to AI safety concerns and must represent a material change from the previous development pace.
This question is managed and resolved by Manifold.
Related questions
Will Xi Jinping publicly call for a global halt in AI progress before 2028?
15% chance
Will there be a global "pause" on cutting-edge AI research due to government regulation by 2025?
1% chance
Will the US government enact legislation before 2026 that substantially slows US AI progress?
18% chance
Will any world leader call for a global AI pause by EOY 2027?
88% chance
Will any AI researchers be killed by someone explicitly trying to slow AI capabilities by end of 2028?
27% chance
Will any developed country establish a limit on compute for AI training by 2026?
21% chance
Will the US regulate AI development by end of 2025?
34% chance
I make a contribution to AI safety that is endorsed by at least one high profile AI alignment researcher by the end of 2026
59% chance
AI Safety Clock at 16 minutes to midnight by October 2025?
41% chance
Will there be a military operation to slow down AI development by the end of 2030?
15% chance