Will Eliezer's "If Artificial General Intelligence has an okay outcome, what will be the reason?" market resolve N/A?
29% chance

/EliezerYudkowsky/if-artificial-general-intelligence

(Note that this is about the multiple-choice market where Eliezer chose the answers, not the free response one where anyone can add their own.)


It makes sense to resolve the market N/A and to replace it with a multi-binary market.

This is basically market doxxing. FUD has that kind of effect (e.g. signalling, bank runs, competent traders acting first, etc.).

The base rate for complex long-term markets resolving N/A seems high. If it resolves at all in the next twenty or so years, I would expect that resolution to be N/A overwhelmingly, so this feels like an easy bet. "Yes" plausibly pays out way earlier, so even if you feel that "No" is more likely, opportunity cost makes it comparatively less valuable.
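The opportunity-cost point above can be made concrete with a time-discounted expected value. This is an illustrative sketch only; the probabilities, payout horizon, and discount rate are assumptions for the sake of the example, not figures from this thread.

```python
def discounted_ev(prob: float, payout: float, years_to_resolution: float,
                  annual_discount: float = 0.10) -> float:
    """Expected value of a bet, discounted for how long the stake is locked up."""
    return prob * payout / (1 + annual_discount) ** years_to_resolution

# Hypothetical numbers: YES (resolves N/A) pays out in ~2 years at 30% probability,
# while NO ties up mana for ~20 years even if it wins with 70% probability.
ev_yes = discounted_ev(prob=0.30, payout=1.0, years_to_resolution=2)
ev_no = discounted_ev(prob=0.70, payout=1.0, years_to_resolution=20)
print(f"YES: {ev_yes:.3f}  NO: {ev_no:.3f}")
```

Under these assumed numbers the earlier-resolving side comes out ahead despite its lower probability, which is the shape of the argument in the comment.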

predicts YES

Why is this in "meta markets on improper resolution" group? There are many ways it could properly resolve N/A.

@MartinRandall The description doesn't include any mention of an N/A resolution, and I think it implies that one of the given options will be the resolution.

predicts YES

@IsaacKing Huh. What would it resolve to if we don't have an okay outcome? The answer that "could have worked" in the creator's judgement? Generally I assume that "if X" markets resolve N/A if not X.

predicts NO

@MartinRandall Oh that's true. I suppose there are some outcomes that are not okay yet still result in the market resolving. But I ascribe those extremely low probability.

predicts YES

@IsaacKing Can you explain a little about why?

predicts NO

@MartinRandall To my understanding, Eliezer's definition of "okay" includes anything better than everybody being dead or tortured for eternity. So for that market's condition to not be met requires some contrived scenario wherein human quality of life is much worse than it is now, yet Manifold Markets still exists and people care about earning mana.

predicts YES

@IsaacKing For a resolution we have to get > 20% of max attainable value. So if we all decide that AI is too dangerous and don't convert the lightcone to hedonism, it resolves N/A.

predicts YES

@IsaacKing

An outcome is "okay" if it gets at least 20% of the maximum attainable cosmopolitan value that could've been attained by a positive Singularity
