Will a global catastrophe kill at least 10% of humans before 2100, and what will be the cause?
No global catastrophe: 60%
Nuclear: 1.5%
Biological: 5%
Artificial Intelligence: 18%
Nanotechnology: 1%
Climate: 12%
Other cause: 1.5%

If the human population decreases by at least 10% in any period of 5 or fewer consecutive calendar years (which we define here as a "global catastrophe"), this market resolves to the principal cause. Otherwise, it resolves to "No global catastrophe".

See https://www.metaculus.com/questions/1493/global-population-decline-10-by-2100/ and https://www.metaculus.com/questions/2568/ragnar%25C3%25B6k-seriesresults-so-far/ for more precise definitions of these categories and for related forecasts.

If multiple global catastrophes occur, resolves based on the first.
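
To make the resolution criterion concrete, here is a minimal sketch (Python, with hypothetical population figures) of the check described above, assuming end-of-year population estimates:

```python
def had_global_catastrophe(pop_by_year: dict[int, float]) -> bool:
    """Return True if population fell at least 10% over any period of
    5 or fewer consecutive calendar years (this market's definition).

    pop_by_year maps a calendar year to end-of-year population, so the
    drop from end of year y to end of year y+k happens during the k
    consecutive calendar years y+1..y+k; we therefore check gaps of
    at most 5 years.
    """
    years = sorted(pop_by_year)
    for i, start in enumerate(years):
        for end in years[i + 1 : i + 6]:  # at most the next 5 recorded years
            if end - start <= 5 and pop_by_year[end] <= 0.9 * pop_by_year[start]:
                return True
    return False

# Hypothetical example: an ~11% drop between end-2040 and end-2044 qualifies.
pops = {y: 9.0e9 for y in range(2030, 2041)}
pops.update({2041: 8.6e9, 2042: 8.3e9, 2043: 8.2e9, 2044: 8.0e9})
print(had_global_catastrophe(pops))  # True
```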


Two curves to consider:

1.) Production of synthetic polymers.

2.) Rate of photo-cleavage.

#2 I'd expect to be roughly linear, while #1 definitely isn't.

Approximately 100 years for this stuff to disperse across the planet and begin slowly breaking down to sizes that can enter the bloodstream.
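
A toy model of the commenter's two curves (every number below is hypothetical, chosen only to illustrate their shapes): exponentially growing production against a roughly constant breakdown rate means the accumulated stock keeps growing.

```python
# Toy model: exponentially growing polymer production vs. a roughly
# constant (linear-in-time) photo-cleavage rate. All numbers hypothetical.
production = 400.0   # Mt produced per year today
growth = 1.04        # assumed ~4% annual growth in production
cleavage = 50.0      # Mt broken down per year, assumed roughly constant
stock = 8000.0       # Mt of plastic accumulated so far

for year in range(1, 101):            # the ~100-year horizon mentioned above
    stock += production - cleavage    # net accumulation this year
    production *= growth
    if year % 25 == 0:
        print(f"year {year:3d}: stock ~ {stock:,.0f} Mt")
```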

bought Ṁ10 Other cause YES

Microplastics.

bought Ṁ10 Other cause YES

Does idiotic ideology count?

Buying "Other" at 1% because the outside view says democide is pretty likely, and escalating that to a full-scale nuclear exchange is unlikely.

How does the Climate option resolve in the event of comorbidity, where both boneheaded government policies and climate change contribute to reducing agricultural output, and without either one of those people would not have starved?

@JonathanRay If there is a 10% decline, it resolves to whatever the "principal cause" is. In a case like you described, generally there should be one that is more important than the other. I will resolve the same as Metaculus if possible.

Unlikely AI kills 10% of humans without finishing the job, so I'm arbing.

@JonathanRay You seem to have bet them to the same price, so you're saying you think it's ~0% likely? I would put it at >20%.

This site chronically overestimates the risk of AI disasters and underestimates the risk of climate disasters. I have some hypotheses as to why, but a lot of it boils down to the Techie Bubble hypothesis.

@evergreenemily I think for a disaster killing 10% of humans, non-AI risks are substantial. But climate, nuclear, etc are very unlikely to cause extinction, whereas AI is substantially more likely. The forecasts at https://www.metaculus.com/questions/2568/ragnar%25C3%25B6k-seriesresults-so-far/ align with this reasoning.

@jack Outright extinction, I agree. But runaway climate change could, in a worst-case scenario, kill 10% of humans in a relatively short period of time. In particular, climate-caused ecological/agricultural collapse followed by famines, especially in densely populated areas, could kill hundreds of millions or even billions of people.

Is it likely? No, I think it's pretty unlikely. But there's way more than a 3% chance of it happening by 2100, and that's what the percentage was at when I checked this market this morning.

(Edit: FWIW, I think Metaculus underestimates the risks of a 10%-of-humans-die catastrophe from climate change, but is accurate in giving human extinction as a result of climate change an extremely low chance by 2100.)

@evergreenemily I think reasonable people can disagree about the probability, I personally think the ~3% predicted by Metaculus is about right. I think climate impacts are most likely to be spread out over time, not concentrated to reach 10% population loss in 5 years - it's possible but very unlikely. As https://astralcodexten.substack.com/p/the-extinction-tournament points out,

It’s unclear whether anything in recorded history would qualify; Genghis Khan’s hordes and the Black Plague each killed about 20% of the global population, but both events were spread out over a few decades.
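
For illustration, assuming an even geometric decline (a simplification): a 20% drop spread over 25 years never comes close to the 10%-in-5-years threshold.

```python
# A 20% population decline spread evenly over 25 years:
total_decline = 0.20
years = 25
annual_factor = (1 - total_decline) ** (1 / years)  # ~0.9911 per year
five_year_drop = 1 - annual_factor ** 5             # ~4.4% per 5-year window
print(f"worst 5-year decline: {five_year_drop:.1%}")  # well below 10%
```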

@jack I agree that reasonable people can disagree! I'd definitely put the odds of "climate change kills 10% of humans over the course of 10-20 years" significantly higher than the odds of "climate change kills 10% of humans over the course of five years", but I'd still put the latter at around 5-10%.

Note that this can be arbitraged with a ton of the markets in https://manifold.markets/group/ai-doom

Specifically, chance that AI wipes out humanity is < chance that AI kills 10% of humanity.
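
A sketch of that consistency check (both prices hypothetical): since extinction entails killing at least 10% of humanity, an AI-extinction market priced above this market's AI option implies mutually inconsistent prices.

```python
# Extinction implies killing >= 10% of humanity, so
# P(extinction by AI) should not exceed P(AI kills >= 10%).
p_extinction_by_ai = 0.25  # hypothetical price on an ai-doom market
p_ai_kills_10pct = 0.18    # this market's "Artificial Intelligence" option
if p_extinction_by_ai > p_ai_kills_10pct:
    gap = p_extinction_by_ai - p_ai_kills_10pct
    print(f"Inconsistent by {gap:.0%}: sell extinction, buy this market")
```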

Interesting that AI is low here compared to the other options, but the AI extinction markets are much higher.

Yes, this is a pretty typical result I have observed in many other prediction contexts, and I think there are several reasons for it:

  • There's a classic https://en.wikipedia.org/wiki/Conjunction_fallacy where asking "Will AI kill X% of humans" sometimes gets a higher prediction than "Will anything kill X% of humans".

    • Prediction markets can be arbitraged to correct this fallacy, but the ROI for arbitraging a market that resolves in years or decades is so low that it's not strongly enforced by the market actors (see the sketch after this list).

  • The people predicting/trading on AI x-risk are different than the people predicting/trading on non-x-risk questions. And, even when it's the same people, the questions tend to elicit different modes of thought that may result in irrational inconsistent predictions. It's particularly noteworthy that people betting YES on x-risk cannot collect their winnings on resolution, so the market incentives are fundamentally different than "normal" prediction markets.
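
A back-of-the-envelope illustration of that ROI point (numbers hypothetical):

```python
# Buying NO at 95% on a mispriced market that resolves around 2100:
price_no = 0.95           # cost of a NO share paying Ṁ1 at resolution
years_to_resolution = 76  # e.g., from 2024 until 2100
total_return = 1 / price_no - 1                                   # ~5.3%
annualized = (1 + total_return) ** (1 / years_to_resolution) - 1  # ~0.07%/yr
print(f"total {total_return:.1%}, annualized {annualized:.2%}")
```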

One reason I made this multiple-choice market as well as some separate binary markets was to see how differently the markets would behave.

How is the fragility of industrial ag. not on the list? (I'm betting 'Other'...)

I don't think it falls directly under "Climate", but it depends on whether you consider direct or indirect causes. Either way, in that scenario we will blame climate when it does contribute to gigafamines.

https://www.theguardian.com/commentisfree/2023/jul/15/food-systems-collapse-plutocrats-life-on-earth-climate-breakdown

@parhizj I think food crises that we believe stemmed from climate change - if we believe they would not have occurred in a counterfactual world with little or no global warming - would count as "Climate". But if we believe the food crises would have occurred regardless, that would count as "Other".

I think this is a reasonable definition for what qualifies as "Climate" in this question:

The question resolves positively if a human-made climate change or geoengineering catastrophe occurs that claims at least 10% of the human population in any period of 5 years or less before 2100. This catastrophe must be due to the direct effects of climate change that are generally believed very unlikely in a counterfactual world with little or no global warming but otherwise similar to ours. These include (but are not limited to) floods, extreme weather, the spreading of infectious disease, and the health effects of extreme heat. Finally, effects due to the use of geoengineering that has been principally motivated to mitigate climate change risks also count towards the population decline.

from https://www.metaculus.com/questions/1500/ragnar%25C3%25B6k-question-series-if-a-global-catastrophe-occurs-will-it-be-due-to-either-human-made-climate-change-or-geoengineering/

@jack It might be hard to differentiate lower yields from climate and from how we are treating soil like dirt. (Mainly, I think, because there won't be a control, as regenerative agriculture is too small a minority to be useful for statistics.)
