If there exists a super-intelligent AI, would the majority of AI researchers answer Yes to "Have we reached AGI?"?
60% chance
Super-intelligent AI:
"Something along the lines of -> smarter than humans at most cognitive tasks, very very good at some key tasks, and can afford to be indifferent to anything it can't do." (@Duncn's comment)
"AI that is better than the majority of humans at most economically valuable tasks, but not necessarily better than the best humans in all of those tasks."
(I created this market to gauge opinion on @Primer's question.)
This question is managed and resolved by Manifold.
@ShadowyZephyr Resolves whenever there is both such a survey and such a superintelligent AI; until then, the market trades according to what that survey would point to.
Related questions
In which year will a majority of AI researchers concur that a superintelligent, fairly general AI has been realized?
Will we have an AGI as smart as a "generally educated human" by the end of 2025?
50% chance
Will artificial superintelligence exist by 2030? [resolves N/A in 2027]
40% chance
Will we have at least one more AI winter before AGI is realized?
25% chance
Will we achieve AGI by 2030? AGI meaning AI being able to do everything on a computer and the internet as well as a human...
42% chance
The probability of extremely good AGI outcomes, e.g. rapid human flourishing, will be >24% in the next survey of AI experts
54% chance
The probability of "extremely bad outcomes e.g., human extinction" from AGI will be >5% in next survey of AI experts
73% chance
Will AI be capable of superhuman persuasion well before (>1yr) superhuman general intelligence?
72% chance
Will AI create the first AGI?
39% chance
Conditional on there being no AI takeoff before 2030, will the majority of AI researchers believe that AI alignment is solved?
34% chance