One of the questions from https://jacyanthis.com/big-questions.
Resolves according to my judgement of whether the criteria have been met, taking into account clarifications from @JacyAnthis, who made these predictions. (The goal is that they'd feel comfortable betting up to their credence in this market, so I want the resolution criteria to match their intention.)
Clarification:
Conditional on no AGI or population collapse.
Because of the likely transformative nature of AGI and superintelligence, I phrase many of my predictions about the future as conditional on no AGI, so that the predictions are less dependent on AGI timeline estimates. I would estimate above 90% that technologies like cultured meat, fusion, and life extension will be developed very quickly after superintelligence. Similarly, I condition on no population collapse because I would estimate under 10% that these technologies would be developed, without decades of recovery time, after a collapse of the human population to below, say, 1 billion (e.g., from nuclear winter or a pandemic).