Will transformers still be the dominant DL architecture in 2026?
79% chance
Resolves true if I judge, based on the prevailing opinion among deep learning researchers, that transformers remain the most popular architecture in deep learning research at the start of 2026. If the answer is unclear, resolves true if at least 50% of 2025 arXiv papers on AI mention transformers, and false otherwise.
This question is managed and resolved by Manifold.
Apparently there is already a very similar market!
https://manifold.markets/LeoGao/will-transformer-based-architecture
Related questions
Will Transformer based architectures still be SOTA for language modelling by 2026?
78% chance
On January 1, 2027, a Transformer-like model will continue to hold the state-of-the-art position in most benchmark
84% chance
Will there be over 1000 Optimus robots working at Tesla before 2026?
35% chance
Will Optimus be used on the Optimus assembly line before the end of 2025?
28% chance
Will superposition in transformers be mostly solved by 2026?
73% chance
Will there be over 10,000 Optimus robots working at Tesla before 2027?
18% chance
Will the most capable, public multimodal model at the end of 2027 in my judgement use a transformer-like architecture?
55% chance
Will a big transformer LM compose these facts without chain of thought by 2026?
64% chance
By the start of 2026, will I still think that transformers are the main architecture for tasks related to natural language processing?
68% chance
Will a big transformer LM compose these facts without chain of thought by 2026? (harder question version)
43% chance