What will be the maximum context length of any OpenAI LLM by EOY 2024?
[0, 200k): 11%
[200k, 800k): 26%
[800k, 1.5M): 26%
[1.5M, 4M): 20%
[4M, 10.5M): 9%
>= 10.5M: 9%

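As a rough sanity check on what the distribution above implies, here is a minimal Python sketch that computes a probability-weighted estimate from the bucket midpoints. The representative value used for the open-ended ">= 10.5M" bucket is an assumption (its lower bound), so the result is only an illustrative lower-bound-ish estimate, not anything the market itself publishes.

```python
# Probability-weighted estimate of the market's implied maximum context
# length, using the midpoint of each bucket (values in tokens).
buckets = [
    (0.11, (0, 200_000)),
    (0.26, (200_000, 800_000)),
    (0.26, (800_000, 1_500_000)),
    (0.20, (1_500_000, 4_000_000)),
    (0.09, (4_000_000, 10_500_000)),
    (0.09, (10_500_000, 10_500_000)),  # open-ended bucket: lower bound used as a placeholder
]

expected = sum(p * (lo + hi) / 2 for p, (lo, hi) in buckets)
print(f"Implied expectation: ~{expected / 1e6:.2f}M tokens")
```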
Currently, GPT-4o has a context length of 128k tokens. Gemini, on the other hand, offers 1M tokens, has announced an extension to 2M, and has published results at up to 10M.

What will be the maximum context length of any of the OpenAI models by EOY 2024?

It needs to be accessible through the API or the web interface, and released to at least several people who aren't part of OpenAI and who can freely test it and report on it. A demo doesn't count, a paper doesn't count, and a release to a select group of people who can't talk about it doesn't count.

But a full public release where anyone can use it isn't strictly required either. The spirit is that we just need to have independent and clear verification (in my personal opinion) that the tech is already there, has an official name, and is probably coming soon to the public.
