Before 2028, will there be enough inference capacity to generate 30T frontier model tokens per day?
36% chance
In "Situational Awareness: The Decade Ahead", Leopold Aschenbrenner claims:
> Another way of thinking about it is that given inference fleets in 2027, we should be able to generate an entire internet's worth of tokens, every single day.
Resolves YES if, by the end of 2027, there is enough deployed inference capacity to generate 30 trillion tokens in a 24-hour period using a combination of frontier models. "Frontier model" is meant in the sense that GPT-4 is a frontier model today, in mid-2024.
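For scale, the 30-trillion-token threshold can be turned into a back-of-envelope sustained-throughput figure. The per-accelerator rate below is purely an illustrative assumption (no particular hardware or model is implied by the market):

```python
# Back-of-envelope: what 30T frontier-model tokens per day implies.
TOKENS_PER_DAY = 30e12
SECONDS_PER_DAY = 24 * 60 * 60  # 86,400

# Sustained generation rate required across the whole fleet.
tokens_per_second = TOKENS_PER_DAY / SECONDS_PER_DAY  # ~347 million tokens/s

# Hypothetical per-accelerator throughput for a frontier model.
# This number is an assumption for illustration only.
ASSUMED_TOKENS_PER_ACCELERATOR_PER_SECOND = 1_000

accelerators_needed = tokens_per_second / ASSUMED_TOKENS_PER_ACCELERATOR_PER_SECOND

print(f"{tokens_per_second:,.0f} tokens/s sustained")
print(f"~{accelerators_needed:,.0f} accelerator-equivalents at the assumed rate")
```

At the assumed 1,000 tokens/s per device, the market's bar works out to roughly 347 million tokens per second, or on the order of a few hundred thousand accelerator-equivalents running continuously; different throughput assumptions scale that count linearly.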
This is one of a series of markets on claims made in Leopold Aschenbrenner's "Situational Awareness" report; see the author's other markets on Leopold's predictions.
Related questions
- Will a new lab create a top-performing AI frontier model before 2028? — 55% chance
- AI: Will someone train a $1B model by 2025? — 66% chance
- By March 14, 2025, will there be an AI model with over 10 trillion parameters? — 66% chance
- Will a model costing >$30M be intentionally trained to be more mechanistically interpretable by end of 2027? (see desc) — 57% chance
- An AI model with 100 trillion parameters exists by the end of 2025? — 26% chance
- 10GW AI training run before 2029? — 50% chance
- Will an OpenAI model have over 500k token capacity by the end of 2024? — 91% chance
- AI: Will someone train a $1T model by 2080? — 62% chance
- Will Inflection AI have a model that is 10X the size of original GPT-4 at the end of Q1, 2025? — 26% chance