From this comment:
"I would currently bet that within 2-3 years we will see a system that gets GPT-4 performance and compute-efficiency whose source-code is extremely simple and does not require a lot of clever hacks, but whose difference from GPT-3 will be best characterized by "0 to 2 concrete insights that improved things", since that is exactly what we've seen with GPT-2 and GPT-3."
This market resolves to YES if such a system exists on or before January 1, 2026 (even if it is only known to have existed long afterwards).
If a system has GPT-4 performance and compute efficiency, but the architecture / source code is not public, this resolves NO by default. However, note that the market resolves in 2035, which gives some additional time for the source code of GPT-4 and similar systems to become public.
Yeah, I agree that most of the uncertainty here comes from "will it be open-sourced".
In my comment I was referring to a broader definition of "seeing" that includes "I talk to OpenAI engineers and they confirm this for me, or like, vaguely hint at it being true because of confidentiality concerns".
@NoaNabeshima If it costs $100M (i.e., without algorithmic efficiency improvements) I don't think the code will get open-sourced, but I'm not sure
@NoaNabeshima It's possible Meta would do this. It's possible a government-funded open-source effort would do this.
@NoaNabeshima You might expect 4-8x algorithmic efficiency improvements in that time, so getting the cost down to ~$12M is not crazy.
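A quick back-of-the-envelope sketch of that estimate (the ~$100M baseline cost and the 4-8x efficiency range are assumptions from this thread, not confirmed figures):

```python
# Back-of-the-envelope: training cost after assumed algorithmic efficiency gains.
# The ~$100M baseline and the 4-8x range are this thread's assumptions.
BASELINE_COST_USD = 100e6  # assumed GPT-4-scale training cost with no efficiency gains

for efficiency_gain in (4, 8):
    cost = BASELINE_COST_USD / efficiency_gain
    print(f"{efficiency_gain}x efficiency -> ${cost / 1e6:.1f}M")

# 4x -> $25.0M, 8x -> $12.5M, i.e. roughly the ~$12M figure above.
```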
@NoaNabeshima It's possible Nvidia would do this (open-source the code; I'm not imagining they'd open-source a GPT-4 model, although it's possible)