Can AI truly replicate the screams of a man on fire? Video game performers want their work protected


Video game performers say they fear AI could reduce or eliminate job opportunities because the technology could be used to replicate one performance into a number of other movements without their consent. That’s a concern that led the Screen Actors Guild-American Federation of Television and Radio Artists to go on strike in late July. “If motion-capture actors, video-game actors in general, only make whatever money they make that day … that can be a really slippery slope,” said Dalal, who portrayed Bode Akuna in “Star Wars Jedi: Survivor.” “Instead of being like, ‘Hey, we’re going to bring you back’ … they’re just not going to bring me back at all and not tell me at all that they’re doing this. That’s why transparency and compensation are so important to us in AI protections.”

Hollywood’s video game performers announced a work stoppage — their second in a decade — after more than 18 months of negotiations over a new interactive media agreement with game industry giants broke down over artificial intelligence protections. Members of the union have said they are not anti-AI. The performers are worried, however, the technology could provide studios with a means to displace them. Dalal said he took it personally when he heard that the video game companies negotiating with SAG-AFTRA over a new contract wanted to consider some movement work “data” and not performance. If gamers were to tally up the cut scenes they watch in a game and compare them with the hours they spend controlling characters and interacting with non-player characters, they would see that they interact with “movers’” and stunt performers’ work “way more than you interact with my work,” Dalal said.

“It’s not just about playing a game. It’s about living in that world.”

That is how one game developer put it in an interview, describing how immersive games have become.

Motion capture starts with the actor's performance. The actor, wearing a suit fitted with markers, performs the desired action while a motion-capture system of cameras and sensors tracks each marker's position. The recorded marker trajectories are then used to build a 3D model of the actor's body, which drives realistic, dynamic character animations.
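As a rough illustration (not any studio's actual pipeline), the step from tracked markers to a skeletal pose is largely geometry: joint angles are recovered from marker positions. The sketch below uses a hypothetical `angle_deg` helper to compute the angle at a joint from three marker coordinates.

```python
import math

def angle_deg(a, b, c):
    """Angle (degrees) at joint b, formed by markers a-b-c in 3D."""
    v1 = [a[i] - b[i] for i in range(3)]   # vector from joint to first marker
    v2 = [c[i] - b[i] for i in range(3)]   # vector from joint to second marker
    dot = sum(x * y for x, y in zip(v1, v2))
    n1 = math.sqrt(sum(x * x for x in v1))
    n2 = math.sqrt(sum(x * x for x in v2))
    return math.degrees(math.acos(dot / (n1 * n2)))

# Toy marker positions (meters) for shoulder, elbow, wrist:
shoulder = (0.0, 1.4, 0.0)
elbow    = (0.3, 1.1, 0.0)
wrist    = (0.6, 1.4, 0.0)
print(round(angle_deg(shoulder, elbow, wrist), 1))  # → 90.0
```

A real system solves this for dozens of markers per frame, at 60+ frames per second, and fits the results to a full skeleton.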

A recent interview with a game developer highlighted the potential of generative AI in game development. The developer, whose studio holds motion capture data from a previous game, explained that animators can use that banked data to train generative AI models that produce new character animations.
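To make the reuse of banked motion concrete without claiming anything about that studio's tools: games have long mined recorded motion libraries with non-generative techniques such as motion matching, which searches banked poses for the closest fit to a character's current state. A toy sketch, with hypothetical names and made-up two-value "poses":

```python
def nearest_frame(bank, query):
    """Index of the banked pose closest to the query pose (squared L2 distance)."""
    def dist(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q))
    return min(range(len(bank)), key=lambda i: dist(bank[i], query))

# A tiny "bank" of recorded poses, each reduced to two degrees of freedom:
bank = [(0.0, 0.0), (1.0, 0.5), (2.0, 1.0)]
print(nearest_frame(bank, (1.1, 0.4)))  # → 1
```

Generative models go a step further: instead of looking up the nearest recorded pose, they are trained on the banked data and synthesize new motion, which is precisely why performers want consent and compensation terms attached to that data.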


“Voice actors may see fewer opportunities in the future, especially as game developers use AI to cut development costs and time,” the report said, noting that “big AAA prestige games like ‘The Last of Us’ and ‘God of War’ use motion capture and voice acting similarly to Hollywood.” Other games, such as “Cyberpunk 2077,” cast celebrities. Actor Ben Prendergast said that data points collected for motion capture don’t pick up the “essence” of someone’s performance as an actor. The same is true, he said, of AI-generated voices that can’t deliver the nuanced choices that go into big scenes — or smaller, strenuous efforts like screaming for 20 seconds to portray a character’s death by fire.

“The big issue is that someone, somewhere has this massive data, and I now have no control over it,” said Prendergast, who voices Fuse in the game “Apex Legends.” “Nefarious or otherwise, someone can pick up that data now and go, we need a character that’s nine feet tall, that sounds like Ben Prendergast and can fight this battle scene. And I have no idea that that’s going on until the game comes out.” Studios would be able to “get away with that,” he said, unless SAG-AFTRA can secure the AI protections they are fighting for. “It reminds me a lot of sampling in the ‘80s and ’90s and 2000s where there were a lot of people getting around sampling classic songs,” he said. “This is an art. If you don’t protect rights over their likeness, or their voice or body and walk now, then you can’t really protect humans from other endeavors.”

© Copyright The Associated Press. All rights reserved. The information contained in this news report may not be published, broadcast or otherwise distributed without the prior written authority of The Associated Press.
