In "Situational Awareness: The Decade Ahead", Leopold Aschenbrenner predicts that the largest AI training clusters will consume 100GW of electricity in ~2030.
This market resolves YES if a training run of a single AI model draws 100GW or more of power, sustained through most of the training run. This power figure includes overhead to run the data center, such as cooling.
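For a rough sense of scale, here is a back-of-envelope sketch. It is not part of the resolution criteria; the per-accelerator power, overhead factor, and run length below are all illustrative assumptions.

```python
# Back-of-envelope sketch only; every constant is an illustrative assumption,
# not part of this market's resolution criteria.

CLUSTER_POWER_W = 100e9   # 100 GW sustained draw, the threshold in this market
GPU_POWER_W = 700         # assumed power of one H100-class accelerator
PUE = 1.2                 # assumed data-center overhead (cooling, networking, ...)
RUN_DAYS = 90             # assumed length of a single training run

gpus = CLUSTER_POWER_W / (GPU_POWER_W * PUE)
energy_twh = CLUSTER_POWER_W * RUN_DAYS * 24 / 1e12   # watts * hours -> Wh -> TWh

print(f"Accelerators powered at 100 GW: ~{gpus / 1e6:.0f} million")
print(f"Energy used over a {RUN_DAYS}-day run: ~{energy_twh:.0f} TWh")
```

Under these assumptions a single run would power on the order of a hundred million accelerators and use a couple of hundred TWh of energy.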
This is one of a series of markets on claims made in Leopold Aschenbrenner's Situational Awareness report. Other markets about Leopold's predictions:
https://www.energy.gov/eere/articles/how-much-power-1-gigawatt
If I'm reading this right, in 2022 all the wind turbines in the US together could have powered such a cluster, and wind supplied 9.2% of all US electricity that year. I can see this resolving YES in a fairly standard "AI is a big deal" future timeline.
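A hedged sketch of that comparison: the 9.2% share is the figure above, while the US total-generation and nameplate-capacity numbers are rough assumptions rather than figures from the market or the linked DOE article.

```python
# Hedged sketch of the wind comparison; values marked "assumed" are rough
# estimates for illustration only.

US_GENERATION_TWH = 4200     # assumed US utility-scale generation in 2022
WIND_SHARE = 0.092           # 9.2% wind share, the figure cited above
WIND_CAPACITY_GW = 140       # assumed installed (nameplate) US wind capacity, 2022
CLUSTER_POWER_GW = 100       # the sustained draw this market asks about

wind_twh = US_GENERATION_TWH * WIND_SHARE   # annual energy from wind
wind_avg_gw = wind_twh * 1000 / 8760        # average wind power across the year

print(f"US wind, 2022 (approx.): {wind_twh:.0f} TWh generated, "
      f"~{wind_avg_gw:.0f} GW average output, ~{WIND_CAPACITY_GW} GW nameplate")
print(f"Cluster in question: {CLUSTER_POWER_GW} GW sustained")
```

Whether wind alone "could have powered it" depends on reading that against nameplate capacity (above 100 GW) or average output (below it); either way, 100 GW sustained is a sizable slice of the US grid.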