this post was submitted on 20 Jan 2025
33 points (82.4% liked)
Not anytime soon. Nvidia tried it, and nobody liked it. LLMs are still bad at creative writing and need a ton of RAM/VRAM just to run, and they often get confused or trail off mid-discussion or mid-roleplay.
The only game that sort of made it work was Suck Up!, where you're a vampire who has to convince an AI to let you into their house so you can suck their blood. It's a fun concept, but even that game gets repetitive quickly, and the LLM is very dumb and random.
NVIDIA didn't just try; they're still at it, and apparently you'll soon be playing against these NVIDIA ACE NPCs in PUBG and a few other games.
https://www.youtube.com/watch?v=wEKUSMqrbzQ
I don't know what sounds more robotic, the AI or the script read for the player.
They don't need a ton of RAM if you use a tiny LLM customized for the game's use cases, and that's what games would be doing.
The downside is that the tinier the model, the stupider it gets.
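A rough back-of-the-envelope sketch of why a small model fits beside game assets in ordinary RAM (illustrative parameter counts, not any specific shipped model; weight storage only, ignoring KV cache and activations):

```python
def model_footprint_gb(n_params: float, bits_per_weight: int = 4) -> float:
    """Approximate weight-storage footprint in GB for a quantized model."""
    return n_params * bits_per_weight / 8 / 1e9

# A general-purpose 70B model at 16-bit vs. a game-sized 1B model at 4-bit:
print(round(model_footprint_gb(70e9, 16), 1))  # 140.0 -- workstation territory
print(round(model_footprint_gb(1e9, 4), 1))    # 0.5 -- fits alongside game assets
```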
Tiny models only get stupid like that because you're taking a general-purpose model that knows everything in the world and compressing all that knowledge too much. If you start with a model that only knows basic English and info about a few hundred things in the game, it can be much smaller.
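One concrete place a game-scoped vocabulary saves parameters is the token-embedding table, which scales as vocab_size × hidden_size (the sizes below are illustrative, not taken from any real model):

```python
def embedding_params(vocab_size: int, hidden_size: int) -> int:
    """Parameters in the token-embedding table alone."""
    return vocab_size * hidden_size

full = embedding_params(32_000, 2048)  # general-purpose vocabulary
game = embedding_params(4_000, 2048)   # basic English + a few hundred game terms
print(full - game)  # prints 57344000
```

The same scoping applies to everything the model memorizes: fewer facts to compress means fewer layers and smaller hidden sizes can still cover them.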