an LLM as capable as GPT-4 runs on one 4090 by March 2025
Ṁ795 · Mar 2
31% chance
e.g. Winogrande >= 87.5%
This question is managed and resolved by Manifold.
Does it count if I can run the LLM with CPU RAM offloading, as Ollama does automatically? (It would be very slow, but it would work.)
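For context on why offloading comes up at all, here is a back-of-envelope VRAM calculation for whether a model's weights alone fit in an RTX 4090's 24 GiB. The parameter counts and bit widths below are illustrative assumptions, and the estimate ignores KV cache and activation memory:

```python
def weight_vram_gib(params_billion: float, bits_per_weight: float) -> float:
    """Approximate VRAM needed just to hold the model weights, in GiB."""
    return params_billion * 1e9 * bits_per_weight / 8 / 2**30

RTX_4090_VRAM_GIB = 24  # total VRAM on one 4090

# Illustrative configurations: (parameters in billions, bits per weight)
for params, bits in [(70, 16), (70, 4), (8, 16)]:
    need = weight_vram_gib(params, bits)
    verdict = "fits" if need <= RTX_4090_VRAM_GIB else "needs CPU offload"
    print(f"{params}B @ {bits}-bit: ~{need:.1f} GiB -> {verdict}")
```

Under these assumptions a 70B model needs CPU offloading even at 4-bit quantization, which is why the "runs on one 4090" criterion hinges on whether such offloaded (and much slower) inference counts.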
Related questions
Which next-gen frontier LLMs will be released before GPT-5? (2025)
an LLM as capable as GPT-4 runs on one 3090 by March 2025
30% chance
Will any LLM outrank GPT-4 by 150 Elo in LMSYS chatbot arena before 2025?
6% chance
Will there be an open source LLM as good as GPT4 by the end of 2024?
68% chance
Will there be an open source LLM as good as GPT4 by June 2024?
14% chance
Will an open-source LLM beat or match GPT-4 by the end of 2024?
85% chance
Will any open source LLM with <20 billion parameters outperform GPT-4 on most language benchmarks by the end of 2024?
13% chance
Will xAI develop a more capable LLM than GPT-5 by 2026?
59% chance
Will a Mamba-based LLM of GPT 3.5 quality or greater be open sourced in 2024?
79% chance
Will there be an OpenAI LLM known as GPT-4.5 by 2033?
72% chance