Will I lead a completed pretraining of a >=1B param language model before EOY 2024?
87% chance · Ṁ80 volume · closes Jan 1
Must be trained on at least 100B tokens and start from random initialization. Distillation is okay only if it meets these requirements.
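For concreteness, here is a minimal Python sketch of how the resolution criteria could be checked against a training run's own logs. The `RunReport` fields are hypothetical placeholders, not a real API; the parameter and token counts would come from whatever the run actually records.

```python
from dataclasses import dataclass

@dataclass
class RunReport:
    n_params: int           # total trainable parameters
    n_tokens_seen: int      # tokens consumed during pretraining
    from_random_init: bool  # True if weights were not warm-started

def meets_criteria(run: RunReport) -> bool:
    """Return True if the run satisfies this market's resolution criteria."""
    return (
        run.n_params >= 1_000_000_000             # >=1B parameters
        and run.n_tokens_seen >= 100_000_000_000  # >=100B tokens
        and run.from_random_init                  # distilling targets is fine,
                                                  # warm-starting weights is not
    )

# Example: a 1.3B-parameter model trained from scratch on 120B tokens qualifies.
print(meets_criteria(RunReport(1_300_000_000, 120_000_000_000, True)))  # True
```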
I'll cast an initial guess and then not participate further in this market.
This question is managed and resolved by Manifold.
Related questions
Will there be an AI language model that strongly surpasses ChatGPT and other OpenAI models before the end of 2024?
3% chance
Will a lab train a >=1e26 FLOP state space model before the end of 2025?
22% chance
Will any 10 trillion+ parameter language model that follows instructions be released to the public before 2026?
53% chance
Will Meta release an open source language model that outperforms GPT-4 by the end of 2024?
67% chance
Will there be an LLM which can do fluent conlang translations by EOY 2024?
72% chance
Will it cost less than 100k USD to train and run a language model that outperforms GPT-3 175B on all benchmarks by the end of 2024?
85% chance
Will any open source LLM with <20 billion parameters outperform GPT-4 on most language benchmarks by the end of 2024?
13% chance
By 2025, will there be a competitive large language model with >50% of its total training data generated by a large language model?
75% chance
How big will Mistral's largest known language model be? (2024)
Will a language model that runs locally on a consumer cellphone beat GPT-4 by EOY 2026?
72% chance