Will an AI model use more than 1e28 FLOPS in training before 2026?
24% chance
Resolution source: Epoch AI's list of notable AI models (https://epoch.ai/data/notable-ai-models). I will check this source on January 1st, 2026, to see whether any model listed there used more than 1e28 FLOP of training compute.
"AI models" includes not only LLMs but also any other type of AI model listed in the resolution source.
As of market creation, the largest LLM is Grok 3, at 4.6e26 FLOP of training compute.
This question is managed and resolved by Manifold.
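The gap between the current frontier and the resolution threshold can be put in perspective with a little arithmetic. A minimal sketch, assuming Grok 3's 4.6e26 FLOP figure from the description and a hypothetical 6-month doubling time for frontier training compute (the doubling time is an assumption for illustration, not part of the market's resolution criteria):

```python
import math

GROK3_FLOP = 4.6e26      # largest disclosed training run at market creation
THRESHOLD_FLOP = 1e28    # resolution threshold

# Factor by which frontier training compute must grow to cross the threshold.
growth_needed = THRESHOLD_FLOP / GROK3_FLOP       # ~21.7x
doublings_needed = math.log2(growth_needed)       # ~4.4 doublings

# Hypothetical doubling time (assumption, not from the market description).
DOUBLING_MONTHS = 6
months_needed = doublings_needed * DOUBLING_MONTHS

print(f"growth needed: {growth_needed:.1f}x")
print(f"doublings needed: {doublings_needed:.1f}")
print(f"months at a {DOUBLING_MONTHS}-month doubling: {months_needed:.0f}")
```

Under that assumed trend, crossing 1e28 FLOP would take on the order of two years rather than the months remaining before resolution, which is consistent with the market pricing well below 50%.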
Related questions
At least one of the most powerful neural nets at end of 2030 will be trained using 10^26 FLOPs
96% chance
At least one of the most powerful neural nets at end of 2026 will be trained using 10^26 FLOPs
96% chance
Will the largest machine learning training run (in FLOP) as of the end of 2025 be in the United States?
89% chance
At least one of the most powerful neural nets at end of 2026 will be trained using 10^27 FLOPs
82% chance
Will an AI achieve >85% performance on the FrontierMath benchmark before 2028?
72% chance
Will a machine learning training run exceed 10^26 FLOP in China before 2026?
52% chance
Will a machine learning training run exceed 10^27 FLOP in China before 2028?
44% chance
Will there be an announcement of a model with a training compute of over 1e30 FLOPs by the end of 2025?
5% chance
Will a machine learning training run exceed 10^26 FLOP in China before 2028?
85% chance
How much FLOP will be used to train the best language model with freely available weights on July 1, 2025?