an LLM as capable as GPT-4 runs on one 4090 by March 2025
28% chance
Resolution criterion: e.g. WinoGrande >= 87.5%
Does it count if I can run the LLM while offloading part of it to CPU RAM, as Ollama does automatically? (It would be very slow, but it would work.)
Related questions
an LLM as capable as GPT-4 runs on one 3090 by March 2025 (29% chance)
Will there be an open source LLM as good as GPT4 by June 2024? (18% chance)
Will there be an OpenAI LLM known as GPT-4.5? by 2033 (33% chance)
Which next-gen frontier LLMs will be released before GPT-5? (2025)
Will there be an LLM (as good as GPT-4) that was trained with 1/100th the energy consumed to train GPT-4, by 2026? (48% chance)
Will it be possible to run an LLM of GPT-4 (or higher) capability on a portable device by 2027? (47% chance)
Will there be an open source LLM as good as GPT4 by the end of 2024? (84% chance)
Will xAI develop a more capable LLM than GPT-5 by 2026 (27% chance)
Will an open-source LLM beat or match GPT-4 by the end of 2024? (81% chance)
There will be an open source LLM approximately as good or better than GPT4 before 2025 (90% chance)