This market is an approximate duplicate of the following markets, but I am trying to do something more systematic:
- https://manifold.markets/EliezerYudkowsky/if-artificial-general-intelligence#q040uk3r5bg
- https://manifold.markets/IsaacKing/if-we-survive-general-artificial-in
- https://manifold.markets/EliezerYudkowsky/if-artificial-general-intelligence-539844cd3ba1?r=RWxpZXplcll1ZGtvd3NreQ
Related markets:
- Why don't we build AGI? https://manifold.markets/dionisos/we-dont-build-agi-before-2100-what
Good question.
I think it should go in "is not powerful enough to kill us".
The fact that we are controlling and overseeing it would be one particular reason it can't kill us.
"Not powerful enough" should be understood as "not powerful enough in the context it is in", and not as "not powerful enough if it were completely free, or if we didn't become cyborgs, or…"
Good question. It would be quite hard to determine it, except in the extreme cases.
I didn't really think about how to resolve the market.
I admit it isn't great.
Let's say it will be decided by what the experts think of it when the time comes; if they disagree, it will resolve to the percentage of experts who think it was (or wasn't) powerful enough.
Do you have another idea?