Why will "If Artificial General Intelligence has an okay outcome, what will be the reason?" resolve N/A?
Too many existing humans suffer death (5%)
Too many existing humans suffer other awful fates (5%)
80% of currently attainable cosmopolitan value becomes unattainable (5%)
The concept of "maximum attainable cosmopolitan value" is not meaningful (6%)
As a demonstration of treacherous turns, trolling, or lulz (57%)
Some other reason (4%)
No reason given after 30 days (8%)
It will not resolve N/A (8%)

This Yudkowsky market will resolve N/A.

/EliezerYudkowsky/if-artificial-general-intelligence

But can you predict why?

Resolves to the reason given by Yudkowsky.


People are uncertain about AI doom but 75% confident that Yudkowsky will do it for the lulz.

ISTM it's likely we all die before EY gets the chance to log on to Manifold and resolve the market.

How does this market resolve if that one doesn't resolve N/A?

@IsaacKing Oh, I hadn't noticed that one of the responses covers that.

@IsaacKing I will take a brief break from luxuriating in 20% of max attainable value to realize that I'm in an impossible thought experiment set up to test my integrity, put down my ultra-chocolate, and carefully resolve this market to the correct answer to demonstrate my counterfactual integrity to the larger universe that is simulating me, thus slightly increasing my expected returns in the larger universe. And then I'll go back to the ultra-chocolate.
