Will we see most new language models shifting to addition-only architectures like BitNet/BitNet 1.58b in 2024?
43% chance
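For context on what "addition-only" means here: BitNet b1.58 constrains weights to the ternary set {-1, 0, +1} (about 1.58 bits per weight), so matrix multiplies reduce to additions, subtractions, and skips. The following is a minimal illustrative sketch of that idea, not the actual BitNet implementation (which operates on packed low-bit tensors with custom kernels):

```python
# Illustrative sketch only: a matrix-vector product with ternary weights
# in {-1, 0, +1}, as in BitNet b1.58. Every "multiplication" becomes an
# addition, a subtraction, or a skip -- no multiply hardware needed.

def ternary_matvec(W, x):
    """W: rows of ternary weights in {-1, 0, 1}; x: list of floats."""
    out = []
    for row in W:
        acc = 0.0
        for w, xi in zip(row, x):
            if w == 1:
                acc += xi      # +1 weight: add the activation
            elif w == -1:
                acc -= xi      # -1 weight: subtract the activation
            # 0 weight: contributes nothing, skip entirely
        out.append(acc)
    return out

W = [[1, 0, -1], [-1, 1, 1]]
x = [2.0, 3.0, 4.0]
print(ternary_matvec(W, x))  # [-2.0, 5.0]
```

The question is whether this multiplication-free style of architecture becomes the default for new language models in 2024.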
This question is managed and resolved by Manifold.
Related questions
By the end of 2026, will we have transparency into any useful internal pattern within a Large Language Model whose semantics would have been unfamiliar to AI and cognitive science in 2006?
38% chance
Will there be an AI language model that strongly surpasses ChatGPT and other OpenAI models before the end of 2024?
3% chance
Will 'jailbreaks' in large language models be solved in principle by the end of 2024?
6% chance
Will Transformer based architectures still be SOTA for language modelling by 2026?
70% chance
Will any language model trained without large number arithmetic be able to generalize to large number arithmetic by 2026?
51% chance
Will Meta release an open-source language model that outperforms GPT-4 by the end of 2024?
67% chance
Will any 10 trillion+ parameter language model that follows instructions be released to the public before 2026?
53% chance
By 2025 will there be a competitive large language model with >50% of the total training data generated from a large language model?
75% chance
Will any open-source LLM with <20 billion parameters outperform GPT-4 on most language benchmarks by the end of 2024?
13% chance
Will Inflection AI have a model that is 10x the size of the original GPT-4 at the end of Q1 2025?
26% chance