Depending on how it’s done, it could make the game better or worse, just like any other tool. I imagine there will be a lot of growing pains as devs figure out what works and what doesn’t.
I could see an MMO using it for small random side quest generation, where any NPC could give you a quest tailored to your character. That kind of stuff would go a long way toward making big open worlds more “living”.
Does that need an AI, or just well-adjusted automated generation?
It’s the same thing. AI is not some magic pixie dust.
ML models ‘learn’ by generating non-human-readable arrays of weights, which is a little pixie-dusty. But its use there is narrow, in a supporting role. My comment was about the core ‘making radiant quests feel tailored to you’ thing. It would still be a set of tables with fillable blanks, its structure and content decided by humans, with a little random or maybe AI-generated content dropped in here and there to add variety. Otherwise it can’t communicate the resulting quest to the rest of the system.
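For anyone curious, here’s a minimal sketch of that tables-with-blanks approach. Every template, table entry, and name below is made up for illustration, and the optional AI step is just a hook:

```python
import random

# Human-authored quest templates with fillable blanks. All templates and
# table entries here are invented for the example.
QUEST_TEMPLATES = [
    "Recover the {item} from {location} and return it to {npc}.",
    "Clear the {enemy} out of {location} for {npc}.",
]

FILL_TABLES = {
    "item": ["stolen heirloom", "shipment of ore", "lost journal"],
    "location": ["the old mill", "Blackreach Cave", "the sunken fort"],
    "enemy": ["bandits", "wolves", "cultists"],
}


def generate_quest(npc_name: str, flavor_fn=None) -> dict:
    """Pick a template and fill its blanks from the tables.

    The engine still gets structured fields it can track; a generative
    model (flavor_fn) would only ever touch the display text.
    """
    template = random.choice(QUEST_TEMPLATES)
    slots = {key: random.choice(values) for key, values in FILL_TABLES.items()}
    slots["npc"] = npc_name

    text = template.format(**slots)
    if flavor_fn is not None:
        # Optional: let an AI reword the quest text for variety,
        # without changing the structured data below.
        text = flavor_fn(text)

    return {"giver": npc_name, "slots": slots, "text": text}


print(generate_quest("Innkeeper Mara"))
```

The structured fields are what the quest system actually tracks; the generated prose never has to be parsed back out of free text.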
As a developer (not of games, but still), I would actually be interested in a tool that can generate simple code snippets for me to correct and assemble into a more complex system. But yeah, as you said, there will be growing pains as everyone figures out the optimal use cases for AI in development
Removed by mod
Your point about the screenplay reminds me of one of my biggest pet peeves with armchair commenters on AI these days.
Yeah, if you hop on ChatGPT, use the free version, and just ask it to write a story, you’re getting crap. But using that anecdotal experience to extrapolate what the SotA can do in production is a massive mistake.
Do professional writers just sit down at a computer and write out page after page into a final draft?
No. They start with a treatment, build out character arcs, write summaries of scenes, etc. Eventually they have a first draft which goes out to readers and changes are made.
To have an effective generative AI screenplay writer you need to replicate multiple stages and processes.
And you likely wouldn’t be using a chat-instruct fine-tuned model, but rather individually fine-tuned models for each process.
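Something like this, as a very rough sketch of the staged approach. The stage names, prompts, and the `call_model` wrapper are all hypothetical, not any particular vendor’s API:

```python
# Each stage gets its own model (or at minimum its own prompt) and feeds
# the next one, rather than asking a single chat model for a finished
# script. `call_model` is a hypothetical stand-in for whatever inference
# backend or fine-tuned checkpoints you'd actually use.

def call_model(model_name: str, prompt: str) -> str:
    """Hypothetical wrapper around a per-stage fine-tuned model."""
    raise NotImplementedError("plug your inference backend in here")


def write_screenplay(premise: str) -> str:
    treatment = call_model("treatment-model", f"Write a treatment for: {premise}")
    arcs = call_model("character-model", f"Build character arcs for:\n{treatment}")
    outline = call_model("scene-model", f"Summarize scenes for:\n{treatment}\n\n{arcs}")
    draft = call_model("draft-model", f"Write a first draft from:\n{outline}")
    notes = call_model("reader-model", f"Give reader feedback on:\n{draft}")
    return call_model("revision-model", f"Revise this draft:\n{draft}\n\nNotes:\n{notes}")
```

The point is that each stage’s output is reviewable and swappable on its own, the same way a human draft goes through readers and revisions.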
Video game writing is going to move more toward writing pipelines for content generation than writing final copy. And my guess is that most writers are going to be very happy when they see what that can achieve, because they’ll be able to create storytelling experiences that are currently regarded as impossible, like ones where character choices really matter to outcomes and aren’t simply the illusion of choice used to keep dialogue trees from fractalizing too early on.
People are just freaking out thinking the tech is coming to replace them, rather than realizing that headcounts are going to remain the same long term and that, with the technology enhancing their efforts, they’ll be creating products beyond what they’ve even imagined.
Like, I really don’t think the average person - possibly even the average person in the industry - really has a grasp of what a game like BG3 with the same sized writing staff is going to look like with the generative AI tech available in just about 2-3 years, even if the current LLM baseline doesn’t advance at all between now and then.
A world where every NPC feels like a fleshed-out, dynamic individual with a backstory, goals, and relationships. Where stories uniquely evolve with the player. These are things that have previously been technically impossible given resource constraints, and attempts to even superficially resemble them ate up significant portions of AAA budgets (e.g. RDR2). And by the end of the next console generation, they will have become as normative as ray tracing or voiced lines are today.
That’s a win-win all around.
Removed by mod
They largely are going to remain the same. Specific roles may shift around as specific workloads become obsolete, and you will have a handful of companies chasing quarterly returns at the cost of long-term returns by downsizing: keeping the product the same while reducing headcount.
But most labor is supply constrained, not demand constrained, and the only way reduced headcounts would remain the status quo across companies is if all companies reduced headcounts without redirecting the improved productivity back into the product.
You think a 7x reduction in texturing labor is going to result in the same amount of assets in game but 1/7th the billable hours?
No, that’s not where this is going. Again, a handful of large studios will try to get away with that initially, but as soon as competitors that didn’t go the downsizing route are releasing games with scene complexity and variety that puts their products to shame that’s going to bounce back.
If the market was up to executives, they’d have a single programmer re-releasing Pong for $79 a pop. But the market is not up to executives, it’s up to the people buying the products. And while AI will allow smaller development teams to produce games in line with today’s AAA scale products, tomorrow’s AAA scale products are not going to be possible with significantly reduced headcounts, as they are definitely not going to be the same scale and scope as today’s leading games.
A 10- or even 100-fold increase in worker productivity only means a similar cut in the number of workers if the product has hit diminishing returns on productivity investment. If anything, the current state of games development is more dependent on labor resources than ever before, so it doesn’t seem we’ve hit that inflection point, or will anytime soon.
Edit: The one and only place I can foresee a significant headcount drop because of AI in game dev is QA. They’re screwed in a few years.
How do you train AI to notice the bugs humans notice? That kinda seems like the software’s exact weakness: creating odd edge cases that make sense to the algorithm but not to the human eye.
Not really.
One of the big mistakes I see people make in trying to estimate capabilities is thinking in terms of all-in-one models.
You’ll have one model that plays the game, trying a wider range of inputs and approaches to reach goals than what humans would produce (similar to existing research, like OpenAI training models to play Minecraft and mine diamonds off a handful of videos annotated with input data plus a lot of unlabeled YouTube videos).
Then the outputs generated by that model would be passed through another process that looks specifically for things ranging from sequence breaks to clipping. Some of those, like sequence breaks, aren’t even detections that need AI, and depending on just what data is generated by the ‘player’ AIs, a fair number of other issues can be similarly detected with dumb approaches. The bugs that would be difficult for an AI to detect would be things like “I threw item A down 45 minutes ago but this NPC just had dialogue thanking me for bringing it back.” But even things like this are going to be well within the capabilities of multimodal AI within a few years, as long as hardware continues to scale such that it doesn’t become cost prohibitive.
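To illustrate the “dumb approaches” part: once synthetic playthroughs emit structured logs, some bug classes fall out of plain rules. The event fields, quest IDs, and world bounds below are invented for the example:

```python
from dataclasses import dataclass

# Toy event log emitted by synthetic "player" runs. All field names,
# quest stages, and bounds here are made up for illustration.

@dataclass
class LogEvent:
    timestamp: float
    kind: str       # e.g. "quest_stage" or "position"
    payload: dict


QUEST_PREREQS = {"meet_king": ["reach_capital"], "final_boss": ["meet_king"]}
WORLD_BOUNDS = (-4096.0, 4096.0)


def find_issues(events: list[LogEvent]) -> list[str]:
    """Flag sequence breaks and out-of-bounds positions with plain rules."""
    issues, completed = [], set()
    for e in events:
        if e.kind == "quest_stage":
            stage = e.payload["stage"]
            missing = [p for p in QUEST_PREREQS.get(stage, []) if p not in completed]
            if missing:
                issues.append(f"{e.timestamp}s: '{stage}' reached before {missing} (sequence break)")
            completed.add(stage)
        elif e.kind == "position":
            x, y = e.payload["x"], e.payload["y"]
            lo, hi = WORLD_BOUNDS
            if not (lo <= x <= hi and lo <= y <= hi):
                issues.append(f"{e.timestamp}s: player out of bounds at ({x}, {y})")
    return issues
```

Checks like these need no model at all; the hard-to-detect stuff (stale dialogue, broken continuity) is where the multimodal models would come in.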
The way it’s going to start is that third-party companies dedicated to QA will start feeding their own data and play tests into models to replicate and extend those behaviors, offering synthetic play testing as a cheap additional service to find low-hanging fruit and cut down on the human tester hours needed, and over time it will shift more and more toward synthetic testing.
You’ll still have human play testers for broader quality questions like “is this fun”, but the bug-hunting QA that’s already being outsourced is almost certainly going to go the way of AI replacing humans entirely, or nearly so.
I hear this, but then I also think of the “So… what happened to all the horses?” question.
Their numbers went down. Drastically. That’s what happened. But that isn’t History when it happens to Horses.
Do you think that same result would have happened if horses had other skills outside of the specific skill set that was automated?
If horses happened to be really good at pulling carts AND really good at driving, for instance, might we not instead have even more horses than we did at the turn of the 19th century, just having shifted from pulling carts to driving them?
I’m not sure the inability of horses to adapt to changing industrialization is the best proxy for what’s going to happen to humans.
And by the end of the next console generation,
I bet we’ll have full self driving cars by then, too!
You jest, but yeah, there very likely will be, especially given that there are already full self-driving cars on roads today. The difference will just be that in ~10 years (by the end of the next console generation) there will be better full self-driving cars on the road.
I dare a self-driving car to drive through a bit of snow
Like this?
“Developers hate it”. No they fucking don’t. I know plenty of game devs saving a shit ton of time with AI.
Ideally AI could be used to reduce the amount of work required to produce AAA assets, and allow that time to go back into quest design and world building. Or just reduce development time so we can get great games more often.
Yeah, another tool like licensing a game engine or procedurally generated content. It will still require a lot of review and revision, custom work to overcome edge cases, and direction to meet your goals.
Yeah, it will never replace 100% of labor, but even reducing it by a bit adds up, and this could be a substantial amount.
Just like automating agriculture, manufacturing, photography, and food production.
The biggest issue is that, due to how capitalism works, the reduction in labor effort means people lose out on income instead of society as a whole benefiting through more free time.
What AAA studio managers hear:
“So you mean I only need two devs now to do the work of 10? Sounds great!”
“And no, we’re not going to lower the price of our games.”
It could be interesting for procedurally generated games. Imagine a world with no fixed map, settlements where every person is completely unique and will talk to you about any subject you want (instead of repeating the same canned phrase or two), a completely different roster of baddies to fight every time, maybe even a storyline that never plays out the same way twice, or a style of play that changes from game to game. I’m hopeful we’ll start to see some truly unique games with AI helping out, though I’m guessing we’ll get a mountain of shovelware that just uses AI to generate shitty nonsensical art assets and meaningless dialogue.
AI-generated maps and NPCs might be ok. Ditto fights, though there would have to be playtesters whose job it is to make sure the result is something winnable and acceptably fair.
The main issue there would be that there IS no continual certainty of that. You’d have to either be able to reroll entire encounters (which would be jarring) or force the AI to DM what happens when you lose an impossible battle (far more rewarding, provided it doesn’t keep doing it). But it may keep doing it. This would be impossible to ever test adequately. Every game on the market might be a hard-mode Bethesda game.
I personally really don’t think I’d enjoy something with a randomly generated cast/main story for the same reason I wouldn’t be interested in owning one singular book whose writing changes every time you read it. I don’t play to kill time; I play for the stories and I get attached like hell to the good ones. I replay them ad nauseam because I miss the characters.
I think it would be intensely entertaining either as a New Game+ or as a wildcard setting that you could turn on and off. That way, the devs still get to tell the tale they wanted and players can mix it up when they’re bored. Otherwise, you’ve downgraded the job of the entire company to filling the AI in on background lore and nothing else.
Other aspects:
• for those that do get attached and wanna re-experience it, you’d need a way to save the information behind the game you just played. That file might be fairly gigantic?
• Would also lead to a weird market for other peoples’ saves. The way modders already make quests, but for an entire plot.
• NPCs and party members that all look like randomized sims.
Have you played/seen Vaudeville? It’s a detective game where every character has their own LLM and TTS trained for a specific personality.
It’s super janky and I never finished it because I kept getting conflicting info from characters, but…it’s a really great use case for it. The massive caveat being that it requires an Internet connection.
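For anyone wondering what the per-character idea looks like structurally, here’s a minimal sketch. To be clear, this is not how Vaudeville is actually implemented; the `chat` function and the character are placeholders:

```python
# `chat` is a hypothetical stand-in for whatever hosted or local model
# the game would actually call; nothing here reflects Vaudeville's real
# implementation.

def chat(system_prompt: str, history: list[dict], player_line: str) -> str:
    raise NotImplementedError("plug a real LLM backend in here")


class NPC:
    def __init__(self, name: str, persona: str):
        self.name = name
        # Per-character system prompt: this is where the distinct
        # personality lives.
        self.system_prompt = (
            f"You are {name}. {persona} Stay in character and never "
            "contradict facts you have already stated."
        )
        self.history: list[dict] = []

    def talk(self, player_line: str) -> str:
        reply = chat(self.system_prompt, self.history, player_line)
        # Keep per-character memory so answers stay consistent; skipping
        # this is exactly where "conflicting info" jank creeps in.
        self.history.append({"player": player_line, "npc": reply})
        return reply


inspector = NPC("Inspector Roy", "A weary detective who suspects everyone.")
```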
The massive caveat being that it requires an Internet connection.
Like literally every game released in the last decade
I had two replies in my inbox. One was yours and the other was about people unnecessarily adding “literally” to their statements lol
Ive been playing games all week in offline mode. In fact I prefer it so it stops updates breaking my mods. Come at me.
How can you do that when they said LITERALLY every game?!
The wording of this is so stupid for a lot of reasons, specifically with how ambiguous “AI” is. The ghosts in Pac-Man are AI.
If they’re talking about generating code bases, that’s just not going to happen.
If they mean LLMs being used by programmers in code editors as a useful tool, like GitHub Copilot, then that’s awesome and an increase in productivity.
Artists can use generative AI for quick textures like repeating grass. But an AI will not be able to match an art style or theme; it has a limited scope, and its output can’t be hand-crafted the way in-game assets with poly budgets require.
Devs obviously love better tools. These save time and increase productivity.
I think you’re on the money, but I have to disagree with one point: AI will absolutely be able to match an existing art style, if not now then very soon.
AI can be a great tool if used properly to enhance human work, but companies seem hell-bent on instead having AI do all the work, cutting human beings out completely and “saving costs”. Recipe for disaster.
I like the idea of using it to give NPCs intelligent things to say.
“There are no countries in Africa that start with the letter K.”
Together we can stop this
Not yet.
People are a bit optimistic about how it could be used; it’s still a bit dumb. In all likelihood it’ll be used in asset creation, since that’s one of the pricier aspects of game design, automating and replacing the more grunt-work stuff. Not design so much as textures, object modeling, etc., which are already easy to do via AI (and easy to train in-house, avoiding lawsuits). That’ll displace “artists”, although texture creation is a bit of a slog anyway.
Should people be worried about writers? Maybe, but I’m not, at least not yet. AI can create filler, but its story writing is abysmal. You’ll still need a creative behind the curtain to build the world, subvert tropes, and so on. AI can assist, but if it’s better than you at writing, you really shouldn’t be a writer.
To use an example from when ChatGPT became mainstream, a certain scifi serial magazine had to close submissions because they were bombarded with cheap and fast short story submissions. According to the editors, these stories were some of the worst they’ve ever seen. I forget the name of the magazine, but I thought it was pretty funny since I was playing with the tool and couldn’t agree more.
Nonetheless, it’s probably for the best. I hate making assets, and my wife used to do translation, which is really boring and underpaid. A lot of game design is incredibly boring and laying off people making those things is probably in their best interest; those jobs suck. The main downside is that the business class of the industry will pocket the profits instead of reinvesting in their products or reducing prices.
A lot of game design is incredibly boring and laying off people making those things is probably in their best interest
People need to pay rent. The fuck is this.
The fuck is what? The jobs in question won’t even pay rent. Translation, for instance, is contract work that pays less than minimum wage even if you do it well, and it’s not a job of passion. If that’s what’s keeping you afloat, your problem isn’t with the gaming industry, it’s with society itself.
Quitting that work was also the best decision my wife ever made, so fuck off with bleeding heart nonsense. Those jobs aren’t jobs society should have.
Even if you remove the jobs, that doesn’t create new ones. If you remove those jobs, the people who were taking them are still there.
Where do they go when the jobs are gone? Is this just meant to force those people to change careers? Did your wife’s skillset transfer anywhere or is she still unemployed? Or did she get a new job that has nothing to do with translation?
Well, putting it another way: the labor market always stabilizes, and it’s only in the last few decades that labor hasn’t risen with demand (at least in the U.S.). But inevitably people find work or create work, and the speed of that could be days or decades depending on a ton of factors that won’t fit in a one-off explanation on Lemmy (especially given how much people don’t like hearing what I have to say, regardless of my own training in policy lol).
But to explain at least a little nuance: people in jobs with low entry requirements often do change industry, and people with training or education sometimes do. Tech companies gave us a great example recently with the massive layoffs. Reports are still coming in, but it seems like many of them just found more work or founded startups. It was kind of interesting that some left tech, but it’s a high-paying field, and even outside of layoffs there’s a notion that if you want a raise, you change jobs, because there’s always someone looking for programmers.
My wife actually ended up in localization, which is slightly different from translation but has room to move up (which she did). Same industry. Not going to dox her, of course, but she managed to get work within a week and has a weirdly high success rate, even if the industry still grossly underpays everyone (gaming is a passion field). Bilingual skill is not easy to train, so she was valuable; she just didn’t know it when she was doing contract-based translation work.
Ugh, and there’s another long-winded explanation I meant to avoid, haha. Look, I don’t worry too much about it if you’re American (or Western European). If you guys want to get upset about something, it can and will harm the jobs that were outsourced decades ago. Translation, for instance, is big in Eastern Europe (e.g. Romania), and automation easily removes those jobs.
Those jobs are jobs society does have, even if you think it shouldn’t. You know what happens if AI takes everyone’s jobs that you think shouldn’t exist? No one has any money. No society is laying out plans for this; no one is setting up any systems to help people when this happens.
And who saves the money? Shareholders. You have the problem with society; everyone else is just trying to pay rent.
Ideally, yeah, AI should be used to automate boring grunt work and enable more people to engage in something creative. Maybe those jobs in the future can transform into something like managing AI’s output and fixing unique edge cases, where human input is still required.
Yes, exactly! And ideally, in the short term, we can minimize the damage that changes like that cause. I’ve seen places where the factory jobs left, and it’s not great without some intervention.
I’d love basic income but… I’m not optimistic. We can always dream, though.
Even if the dream isn’t going to be realized fully, it’s still useful to have a direction to move in