Will anyone train a TokenFormer model at scale before 2026?
25% chance
Will anyone train a TokenFormer model using at least (the equivalent of) 200,000 H100-hours before 2026?
This question is managed and resolved by Manifold.
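For scale, a rough back-of-envelope conversion of the 200,000 H100-hour threshold into training FLOP, assuming ~1e15 FLOP/s peak dense BF16 throughput per H100 and ~40% utilization (both figures are assumptions, not part of the question):

# Back-of-envelope estimate of the compute implied by 200,000 H100-hours.
# Assumptions (not from the question): ~1e15 FLOP/s peak dense BF16 per H100,
# ~40% model FLOPs utilization during training.
PEAK_FLOPS_PER_H100 = 1e15   # assumed peak dense BF16 throughput, FLOP/s
MFU = 0.40                   # assumed model FLOPs utilization
h100_hours = 200_000

total_flop = h100_hours * 3600 * PEAK_FLOPS_PER_H100 * MFU
print(f"~{total_flop:.1e} FLOP")  # ~2.9e23 FLOP, roughly GPT-3 pretraining scale

Under these assumptions the threshold corresponds to on the order of 10^23 FLOP, well below the 1e26 FLOP figure in the state-space-model question listed below.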
Related questions
Will an OpenAI model have over 500k token capacity by the end of 2024?
50% chance
Before 2028, will there be enough inference capacity to generate 30T frontier model tokens per day?
39% chance
AI: Will someone train a $1B model by 2025?
67% chance
Will a lab train a >=1e26 FLOP state space model before the end of 2025?
22% chance
Will a new lab create a top-performing AI frontier model before 2028?
57% chance
Will a model costing >$30M be intentionally trained to be more mechanistically interpretable by end of 2027? (see desc)
57% chance
Will models be able to do the work of an AI researcher/engineer before 2027?
40% chance
Will OpenAI release a tokenizer with vocab size > 150k by end of 2024?
42% chance
Will there be a more sample-efficient pretraining algorithm than next token prediction for NLP before 2027?
43% chance
10GW AI training run before 2029?
41% chance