Is attention all you need? (transformers SOTA in 2027)
61% chance

This market simulates the wager between Jonathan Frankle (@jefrankle) and Sasha Rush (@srush_nlp).

Details can be found at https://www.isattentionallyouneed.com/

Proposition

On January 1, 2027, a Transformer-like model will continue to hold the state-of-the-art position in most benchmarked tasks in natural language processing.



What about hybrid models, like Jamba? They might be the best of both worlds.

predicts YES

Yes, given that an architecture qualifies if it leverages a combination of transformer models and supporting infra components that wouldn't be considered breakthrough technologies on their own (e.g. RAG).

So do mixtures of experts count? The linked page does not contain any actual details.

predicts YES

@EchoNolan I talked to Sasha, and his response is basically that as long as the E in the MoE is a Transformer, it's a Transformer.
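To make that reading concrete, here is a minimal sketch (my own illustration, not anything from the bet or the linked page) of a mixture-of-experts layer whose experts are ordinary Transformer encoder blocks; under the interpretation above, such a model would still count as a Transformer. It assumes PyTorch, and the class name, dimensions, and top-1 routing are made up for the example.

```python
import torch
import torch.nn as nn

class TransformerExpertMoE(nn.Module):
    """Hypothetical MoE layer where every expert is a Transformer encoder block."""

    def __init__(self, d_model=64, n_heads=4, n_experts=4):
        super().__init__()
        # Each expert is a standard Transformer encoder layer.
        self.experts = nn.ModuleList(
            nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
            for _ in range(n_experts)
        )
        # Simple top-1 router: one gating score per expert, per example.
        self.router = nn.Linear(d_model, n_experts)

    def forward(self, x):  # x: (batch, seq_len, d_model)
        gate = self.router(x.mean(dim=1))   # (batch, n_experts)
        choice = gate.argmax(dim=-1)        # pick one expert per example
        out = torch.empty_like(x)
        for i, expert in enumerate(self.experts):
            mask = choice == i
            if mask.any():
                out[mask] = expert(x[mask])
        return out

# Tiny usage example
moe = TransformerExpertMoE()
print(moe(torch.randn(2, 10, 64)).shape)  # torch.Size([2, 10, 64])
```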

I have strong principled reasons this should stay at 50% for the next 24 hours

subsidy phase-in

predicts YES

@jacksonpolack Hm, I will add in the subsidy at a later point, wherever the market stabilizes, to maintain that.