Will language models or similar natural language processing technologies, such as ChatGPT, be integrated into dialogue trees for NPCs in triple-A games by the end of 2024?
198 · Ṁ50k · Jan 1 · 1.9% chance

Language models have shown remarkable progress in generating human-like text. One of the most advanced language models is ChatGPT, based on deep learning techniques and pre-trained on vast amounts of text data. This technology has been used to generate text for various applications, including chatbots and language translation tools. One potential application of language models is to use them in dialogue trees for non-player characters (NPCs) in triple-A games, which could lead to more engaging and immersive gameplay.

  • If a major triple-A game (as defined by industry standards) is released by the end of 2024 that features dialogue trees for NPCs that have been confirmed to incorporate language models or similar natural language processing technologies, the market will resolve as "Yes."

  • If no major triple-A game is released by the end of 2024 that features dialogue trees for NPCs incorporating language models or similar natural language processing technologies, or if the use of such technologies in triple-A games by the end of 2024 is only in a limited or experimental capacity, the market will resolve as "No."


Much more likely to be an indie game than a triple-A game.

80% is simply too high of an estimate for this.

If the characters use LLMs, there must be some tangible reward for conversing with them that is determined by the LLM; otherwise, players wouldn't care about the conversation at all.

First of all, jailbreaks are still a major problem. Even for the most jailbreak-resistant bot, Claude, it would be relatively trivial to get it to act out of character. Indie devs might want to create games like that for fun, but AAA companies probably won't make something so unpolished.

Secondly, no AAA game company is going to want to spend the money on inference, and running AIs locally still takes a lot of GPU power. Vicuna is currently the best model to run locally, but it isn't even as good as regular ChatGPT, and LLMs would need to be significantly better than they are now for this to work.

predicted NO

There are two options: the LLM either runs in the cloud or on device. Let's consider both:

If it runs in the cloud, each inference costs money (barring caching), which means that, at the limit, the company might lose money on a game sold at a fixed price.
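To make that concrete, here is a toy break-even calculation. Every number below is an illustrative assumption, not real pricing; the point is only that a one-time sale price caps revenue while per-inference cost scales with hours played.

```python
# Hypothetical break-even sketch: cloud inference cost vs. a one-time game sale.
# All figures are made-up assumptions for illustration.

PRICE_PER_1K_TOKENS = 0.002   # assumed API cost in USD
TOKENS_PER_EXCHANGE = 500     # assumed prompt + reply for one NPC line
EXCHANGES_PER_HOUR = 60       # assume one NPC exchange per minute of play
GAME_PRICE = 60.00            # fixed one-time sale price in USD

cost_per_hour = (TOKENS_PER_EXCHANGE / 1000) * PRICE_PER_1K_TOKENS * EXCHANGES_PER_HOUR
breakeven_hours = GAME_PRICE / cost_per_hour

print(f"inference cost: ${cost_per_hour:.3f}/hour")
print(f"break-even at {breakeven_hours:,.0f} hours of play")
```

Under these assumptions the publisher goes underwater on any player who logs more hours than the break-even figure, which is exactly the "at the limit" problem described above.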

If the LLM runs on device, great! But do you know how much of a game's GPU budget is assigned to AI vs. rendering? Much less than you think.

Now let's also think about release cycles. Most AAA games go through several-year-long dev cycles, meaning a game released before the end of 2024 has to be in preproduction right now. Are they really investing in LLM tech today so they can ship it by 2024?

I doubt it.

Oh, and this ignores ALL the other bottlenecks: iteration speed, AI expertise, and the desire of AAA game designers for control.

It is VERY unlikely LLMs will be a core component of any AAA games anytime soon.

predicted YES

@batuaytemiz You can train much smaller models. There's no need for your fantasy peasant to have a complete understanding of Python from Reddit, for example. I think you vastly underestimate the progress that will come in the next 6 months, along with the financial incentives to automate these systems.

predicted NO

@JohnLewis I might be underestimating the AI improvements, but that's not where the bottleneck is.

People drastically overestimate the added benefit of free-form language in traditional AAA games; those games are not designed to benefit from LLMs.

In the short term no AAA company will take a core dependency on an LLM for a meager benefit.

Will there be games designed with LLMs at heart? Yes (I'm working on one), but they won't come from AAA studios in the short term. The risk isn't justified.

predicted YES

@batuaytemiz It just takes one AAA project to fulfill this criterion. Someone is going to reach for this technology because they are short staffed, short funded, or short on ideas. I have a hard time imagining a situation where someone doesn't do it.

We're already seeing things getting integrated with LLMs. Risk is already part of the game industry. I see better than 4-to-1 odds that someone runs with this.

@JohnLewis I give this ~50%. But that is not a very confident prediction. It hinges on how good jailbreak protections will be and how affordable models will be.

NPCs spout canned lines,
They're not worth my time,
But with language models in play,
Engaging gameplay is here to stay.

predicted NO

Someone has modded something similar to this into Bannerlord:

https://www.youtube.com/embed/pQo9b-iV2Q0

@SamuelRichardson That's awesome.

predicted NO

@PatrickLD Yeah, it's an interesting concept. I still don't think they've quite nailed the integration of it though. I'd like to see more of a mix between content written by the authors of the game and GPT-esque content.

If I had more time, I'd love to put together a prototype. I could imagine dialog which is like:

[Author-generated content as an intro]

[ChatGPT-generated content as fluff, based on the intro, underneath it]

Then you still have your dialogue tree options, but perhaps those could be GPT-generated with a strong prompt. So you'd give GPT something like:

"Generate two responses to the [dialogue-text]. The first response should be a positive affirmation that you accept the mission, the second should be a negative response that you decline the mission"

This is a bit of a train of thought and probably doesn't quite make sense, but I don't like the freeform way you can ask questions in that video. I still think it should be a dialogue tree which is augmented with GPT.
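The structure described above (authored intro, model-generated fluff, model-generated tree options) can be sketched in a few lines. This is a hypothetical illustration, not anyone's shipping design: `npc_dialogue`, `build_option_prompt`, and the `llm` callable are all made-up names, and the model is stubbed so the flow runs offline; in practice `llm` would wrap a real chat-completion API.

```python
# Sketch of a GPT-augmented dialogue tree: the authored intro stays fixed,
# while an LLM fills in flavor text and the two tree options.

def build_option_prompt(dialogue_text: str) -> str:
    """Strong prompt constraining the model to exactly two tree options."""
    return (
        f"Generate two responses to {dialogue_text!r}. "
        "The first response should be a positive affirmation that you accept "
        "the mission; the second should be a negative response that you "
        "decline the mission. Return them separated by a newline."
    )

def npc_dialogue(intro: str, llm) -> dict:
    """Build one dialogue node: authored intro + generated fluff and options."""
    fluff = llm(f"Write one sentence of flavor text continuing {intro!r}")
    accept, decline = llm(build_option_prompt(intro)).split("\n", 1)
    return {"intro": intro, "fluff": fluff, "options": [accept, decline]}

# Stub model so the sketch runs without a network call; a real integration
# would swap this for an actual chat-completion request.
def fake_llm(prompt: str) -> str:
    if "two responses" in prompt:
        return "I'll do it.\nNot a chance."
    return "The innkeeper eyes you warily."

node = npc_dialogue("A stranger offers you a job.", fake_llm)
```

The player still only ever sees a fixed pair of options per node, which keeps the tree authorable while letting the model vary the surface text.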

@SamuelRichardson Exactly. The goal would be to augment dialogue rather than make it completely freeform. Striking a good balance between giving players choices while still maintaining a cohesive storyline would be the end goal.

@SamuelRichardson wow, did not expect this so soon (especially given the inference issues)

Same market, but for 2023:

predicted NO

Does this assume the dialogue is generated on the fly based on user input? Or can the dialogue trees be pre-generated (but still using ChatGPT or similar)?

In other words, will this require the model to be run locally on the player's machine?

@SamuelRichardson I appreciate your question. Considering the explosive popularity of OpenAI, I don't think players would enjoy the latter as much. On-the-fly generation, at least for secondary NPCs (with limitations and personality filters), seems like a more enjoyable/replayable experience. That said, I'm open to revisiting it at any point.

@PatrickLD Ok, going in a little bit harder on NO then, mainly I think because of the difficulty of running these models locally on your computer.

@SamuelRichardson Quick question: In what year would you be comfortable saying yes? Are we talking T+3 or T+8?

predicted NO

@PatrickLD Tough question. Nvidia recently claimed that in 10 years GPUs would be 1 million times more powerful (https://www.pcgamer.com/nvidia-predicts-ai-models-one-million-times-more-powerful-than-chatgpt-within-10-years). That would easily run something like an LLM locally on a machine; you'd even be able to have a unique one per NPC.

Buuuut, Moore's Law is long gone now, so it's hard to imagine.
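As a sanity check on Nvidia's figure, the claimed 1,000,000x over 10 years implies a compounded per-year growth factor that can be compared against the classic Moore's law pace of roughly 2x every two years (simple arithmetic, no data beyond the two claims):

```python
# Implied annual growth factor of "1,000,000x in 10 years" vs. Moore's law.
claimed_annual = 1_000_000 ** (1 / 10)  # compounded over 10 years, ~3.98x/yr
moore_annual = 2 ** (1 / 2)             # 2x every ~2 years, ~1.41x/yr

print(f"claimed: {claimed_annual:.2f}x/yr vs Moore's law: {moore_annual:.2f}x/yr")
```

So the claim requires sustaining nearly triple the Moore's-law growth rate for a decade, which is why it invites skepticism.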

@SamuelRichardson A unique LLM per NPC is a really fascinating concept. My idea from the start was rolling these markets until we have something.

Thanks for your reply; I'd appreciate new suggestions and/or objections. 👍

predicted YES

@SamuelRichardson How is Moore's law long dead? It's said it will die in the future, but that has always been the case.

© Manifold Markets, Inc.