Will there be an LLM (as good as GPT-4) that was trained with 1/100th the energy consumed to train GPT-4, by 2026?
53% chance
The total energy consumed to train GPT-4 can be roughly estimated at 50-60 million kWh.

1/10th of this energy = 5-6 million kWh

1/100th of this energy = 0.5-0.6 million kWh
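The threshold arithmetic above can be sketched in a few lines. Note the 50-60 million kWh baseline is the question author's estimate, not a confirmed figure for GPT-4's training run:

```python
# Assumed baseline range for GPT-4's training energy, in kWh
# (the question author's estimate, not an official figure).
GPT4_TRAINING_KWH = (50_000_000.0, 60_000_000.0)

def energy_threshold(divisor):
    """Energy budget range (kWh) for a run using 1/divisor of the baseline."""
    low, high = GPT4_TRAINING_KWH
    return (low / divisor, high / divisor)

print(energy_threshold(10))   # 1/10th  -> (5000000.0, 6000000.0)
print(energy_threshold(100))  # 1/100th -> (500000.0, 600000.0)
```

So this market resolves YES only if a GPT-4-class model is trained on roughly 0.5-0.6 million kWh or less.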
This question is managed and resolved by Manifold.
If such a model is trained on synthetic data generated with a precursor model, does this take into account the energy used to train the precursor + run inference on it to produce the synthetic data?
Related questions
Which next-gen frontier LLMs will be released before GPT-5? (2025)
Will any LLM outrank GPT-4 by 150 Elo in LMSYS chatbot arena before 2025?
6% chance
Will there be an open source LLM as good as GPT-4 by the end of 2024?
68% chance
Will there be an LLM (as good as GPT-4) that was trained with 1/10th the energy consumed to train GPT-4, by 2026?
85% chance
Will there be an open source LLM as good as GPT-4 by June 2024?
14% chance
Will an open-source LLM beat or match GPT-4 by the end of 2024?
85% chance
Will there be an OpenAI LLM known as GPT-4.5 by 2033?
72% chance
Will any open source LLM with <20 billion parameters outperform GPT-4 on most language benchmarks by the end of 2024?
13% chance
an LLM as capable as GPT-4 runs on one 4090 by March 2025
31% chance
an LLM as capable as GPT-4 runs on one 3090 by March 2025
30% chance