Will an LLM trained with FP4 have competitive performance in 2 years time?
22% chance
"Currently, the technology for 4-bit training does not exist, but research looks promising and I expect the first high performance FP4 Large Language Model (LLM) with competitive predictive performance to be trained in 1-2 years time." (see: https://timdettmers.com/2023/01/16/which-gpu-for-deep-learning/)
Granted, the model must be open source for us to verify this, so the market will resolve based on publicly available information.
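For intuition on what "trained with FP4" means: an FP4 number (in the common E2M1 layout, 1 sign, 2 exponent and 1 mantissa bit) can only take 15 distinct values, so weights and gradients must be snapped to a tiny grid with a shared scale factor. A minimal round-to-nearest sketch, assuming per-block scaling (the helper names here are illustrative, not from any specific library):

```python
# Values representable by FP4 E2M1: +/- {0, 0.5, 1, 1.5, 2, 3, 4, 6}.
FP4_E2M1 = [0.0, 0.5, 1.0, 1.5, 2.0, 3.0, 4.0, 6.0]
FP4_GRID = sorted({s * v for v in FP4_E2M1 for s in (-1.0, 1.0)})

def quantize_fp4(x, scale):
    """Scale x into FP4 range, snap to the nearest representable value."""
    y = x / scale
    q = min(FP4_GRID, key=lambda v: abs(v - y))
    return q * scale

# In practice a tensor stores one shared scale per block of weights;
# here the block's max magnitude is mapped onto the largest grid value, 6.
weights = [0.07, -0.21, 0.55, -1.2]
scale = max(abs(w) for w in weights) / 6.0
quantized = [quantize_fp4(w, scale) for w in weights]
```

The open research question behind the market is whether training (not just inference) remains stable when gradients and optimizer states pass through such a coarse grid.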
This seems important @typedfemale
Will this resolve YES if scaling laws suggest a 4-bit model would be competitive if compute-matched to a SOTA 16-bit model?
@NoaNabeshima Yes — the model needs to be competitive with the best models while being trained in 4-bit (to some extent)
Related questions
Will Europe be competitive in the LLM race compared to OpenAI or Google at the end of 2024?
7% chance
Will an LLM improve its own ability along some important metric well beyond the best trained LLMs before 2026?
58% chance
Will a publicly-available LLM achieve gold on IMO before 2026?
45% chance
Will LLMs mostly overcome the Reversal Curse by the end of 2025?
64% chance
Will an opensource LLM on huggingface beat an average human at the most common LLM benchmarks by July 1, 2024?
74% chance
Will an LLM be able to solve the Self-Referential Aptitude Test before 2027?
66% chance
Will a LLM beat human experts on GPQA by Jan 1, 2025?
90% chance
At EOY 2024, who will have the best LLM?
Will any LLM released in the next year double my coding productivity?
24% chance
Will any LLM outrank GPT-4 by 150 Elo in LMSYS chatbot arena before 2025?
18% chance