
This video game actor doesn’t want his performance to be treated as just data

For hours, motion-capture sensors attached to Noshir Dalal’s body tracked his movements as he unleashed the strikes and one-handed attacks that would later appear in a video game. Eventually, he swung the sledgehammer gripped tightly in his hands so many times that he tore a tendon in his forearm. By the end of the day, he couldn’t open his car door.

The physical strain involved in this kind of motion work and the hours he puts in are part of the reason he believes all video game performers should be equally protected from unregulated use of AI.

Video game performers say they fear AI could reduce or eliminate job opportunities because the technology could be used to replicate one performance into any number of other movements without their consent. It’s a concern that prompted the Screen Actors Guild-American Federation of Television and Radio Artists to go on strike in late July.

“If motion capture actors, video game actors in general, are only making the money that they’re making that day … it can be a very slippery slope,” said Dalal, who played Bode Akuna in “Star Wars Jedi: Survivor.” “Instead of saying, ‘Hey, we’re going to bring you back’ … they just won’t bring me back at all and won’t tell me they’re doing it at all. That’s why transparency and compensation are so important to us when it comes to AI protections.”

Hollywood’s video game performers announced a work stoppage, their second in a decade, after more than 18 months of negotiations with gaming industry giants over a new interactive media agreement broke down over artificial intelligence protections. Union members say they are not anti-AI; performers are worried, however, that the technology could give studios a means to replace them.

Dalal said he took it personally when he heard that the video game companies negotiating a new contract with SAG-AFTRA wanted to treat some motion work as “data” rather than performance.

If players were to tally up the cutscenes they watch in a game and compare them with the hours they spend controlling characters and interacting with non-player characters, “you would see that you’re interacting with the work of motion-capture performers and stunt performers more than you’re interacting with my work,” Dalal said.

“They’re the ones that sell the world that these games live in, when you’re doing combos and pulling off crazy, super cool moves using the Force, or you’re playing Master Chief, or you’re Spider-Man swinging through town,” he said.

Some actors argue that AI could deprive less experienced performers of the chance to land smaller background roles, such as non-player characters, where they typically cut their teeth before landing bigger jobs. Uncontrolled use of AI, performers say, could also lead to ethical issues if their voices or likenesses are used to create content they morally disagree with. That kind of dilemma has already arisen with game “modding,” in which fans modify and create new game content. Last year, voice actors spoke out against mods of the role-playing game “Skyrim” that used AI to generate performances and clone actors’ voices for pornographic content.

In video game motion capture, actors wear special Lycra or neoprene suits dotted with markers. In addition to more involved interactions, they perform basic movements such as walking, running or holding an object. Animators then draw from those motion-capture recordings and stitch them together to respond to whatever someone playing the game does.

“What AI allows game developers or game studios to do is to automatically generate many of these animations from previous recordings,” said Brian Smith, an assistant professor in Columbia University’s Department of Computer Science. “There’s no longer a need for studios to collect new footage for every game and every type of animation they want to create. They can also draw on their archive of past animations.”

If a studio has stored motion capture from a previous game and wants to create a new character, he said, animators could use that stored footage as training data.

“With generative AI, you can generate new data based on that previous data model,” he said.
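To make the idea concrete, here is a deliberately simplified sketch, not any studio’s actual pipeline and far simpler than the generative models Smith describes: even plain interpolation between two archived motion-capture clips can produce a “new” animation without a new recording session, which is the basic economics performers are worried about. The clip names and joint-angle numbers below are invented for illustration.

```python
# Toy illustration: synthesizing a new animation by blending archived
# motion-capture clips, rather than recording a performer again.

def blend_clips(clip_a, clip_b, weight):
    """Interpolate two equal-length pose sequences frame by frame.

    Each clip is a list of frames; each frame is a list of joint angles
    (degrees). weight=0.0 returns clip_a, weight=1.0 returns clip_b.
    """
    blended = []
    for frame_a, frame_b in zip(clip_a, clip_b):
        blended.append([
            (1.0 - weight) * a + weight * b
            for a, b in zip(frame_a, frame_b)
        ])
    return blended

# Two archived walk cycles (2 frames, 3 joint angles each), captured once.
archive = {
    "walk_relaxed": [[0.0, 10.0, 20.0], [5.0, 15.0, 25.0]],
    "walk_alert":   [[2.0, 14.0, 28.0], [7.0, 19.0, 33.0]],
}

# A new character's walk, generated halfway between the two archived ones,
# with no new capture session.
new_walk = blend_clips(archive["walk_relaxed"], archive["walk_alert"], 0.5)
print(new_walk)  # [[1.0, 12.0, 24.0], [6.0, 17.0, 29.0]]
```

A generative model replaces the fixed blending rule with one learned from the whole archive, but the input is the same: performances that were paid for once.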

Audrey Cooling, a spokeswoman for the video game makers, said the studios have offered “significant” AI protections, but SAG-AFTRA’s bargaining committee said the studios’ definition of who counts as a “performer” is key to understanding which members would be protected.

“We’ve worked hard to deliver proposals with reasonable terms that protect performers’ rights while ensuring we can continue to use the most advanced technology to create a great gaming experience for fans,” said Cooling. “We have proposed terms that provide consent and fair compensation for anyone employed under (the contract) if an AI reproduction or digital copy of their performance is used in games.”

The gaming companies offered wage increases, she said, with an initial 7% bump in scale rates and an additional 7.64% increase effective in November, amounting to a 14.5% raise over the life of the contract. The studios also agreed to increased per diems, overnight travel pay, and higher overtime rates and bonus payments, she added.

“Our goal is to reach an agreement with the union that will end this strike,” Cooling said.

A 2023 report on the global gaming market from industry tracker Newzoo predicted that video games will begin to include more AI-generated voices, similar to the voice in Squanch Games’ “High on Life.” Game developers, the Amsterdam-based firm said, will use artificial intelligence to produce unique voices, bypassing the need to find voice actors.

“Voice actors may see fewer opportunities in the future, especially as game developers use artificial intelligence to reduce development costs and time,” the report said, noting that “big AAA prestige games like ‘The Last of Us’ and ‘God of War’ use motion capture and voice acting in a Hollywood-like way.”

Other games, such as “Cyberpunk 2077,” have featured celebrities.

Actor Ben Prendergast said the data points collected for motion capture do not capture the “essence” of someone’s performance as an actor. The same is true, he said, of AI-generated voices that can’t offer the nuanced choices that go into big scenes — or smaller, tedious efforts like screaming for 20 seconds to portray a character’s death by fire.

“The big problem is that someone, somewhere has this massive data, and now I have no control over it,” said Prendergast, who voices Fuse in the game “Apex Legends.” “Bad or otherwise, someone can pick up that data now and go, we need a character who is nine feet tall, sounds like Ben Prendergast, and can fight this fight scene. And I have no idea that’s going to happen until the game comes out.”

The studios could “get away with it,” he said, unless SAG-AFTRA can secure the AI protections it’s fighting for.

“It reminds me of a lot of cases in the ’80s, ’90s and 2000s where there were a lot of people going around sampling classic songs,” he said. “This is an art form. If you don’t protect the rights to their likeness, or their voice, or their body now, then you can’t protect people from other efforts.”
