“Pay for UK performers has stagnated despite games being a multibillion-dollar industry”
UK arts and entertainment union Equity have unveiled a raft of “best practice guidelines” for video game developers hiring voice actors, including some suggested minimum rates that are designed to address “systemic low pay” for performers. Other measures are designed to improve voice actor working conditions, and stop companies using their voices and likeness as fuel for generative AI tools without their consent. It’s both a praiseworthy endeavour and an interesting breakdown of the voice-actor’s trade.
“The videogames industry is a dynamic place to work but unethical practices undermine the profession,” Equity explain on their official site. “Pay for UK performers has stagnated despite games being a multibillion-dollar industry with almost £200 million in tax breaks. Performers don’t have the protections they need in the unregulated world of AI, the misuse of NDAs is common and health and safety is often lacking.”
The best practice guidelines span fair pay, the need for voice actor consent when creating digital replicas or “training” AI software, provisions for safer workplaces (including guidance about intimacy coordinators and vocal cord strain), and stipulations about “clear, legal, enforceable, fair, proportionate and targeted” non-disclosure agreements.
As regards NDAs, Equity argue that: “At the moment, many of the NDA agreements we see reflect poorly upon the industry, often intimidating and isolating Performers from those that support them. Performers are forced to sign documents which they have no hope of understanding or amending.
“The result is that Performers often assume that they will be sued if they tell anyone anything about the production, even where they have been victims of or witnesses to a criminal offence,” the page continues. “This is all the more shocking five years since the Harvey Weinstein scandal highlighted the appalling misuse of NDAs, shook the film industry, and ignited the #MeToo movement. By now, we would have expected Engagers to be following best practice and not repeat past mistakes.”
I like hearing about production niceties such as these. Now that I’ve learned the term, I find myself surprisingly laden with memories of walla voicework. There’s a backing track of anomalous battlefield mitherings you hear during loading breaks for Total War: Warhammer 3, for example. I had that stuck in my head for a while, albeit partly because my review PC at the time was a sclerotic antique, and it sometimes took five minutes to load a battle.
As Eurogamer’s Ed Nightingale reminds us, Equity’s best practice guidelines come amid strike action from US actors’ union SAG-AFTRA. In July, SAG-AFTRA spokespeople commented that “AI protections remain the sticking point” for the strike, following disagreements between the union and members about deals with individual companies who use the tech. “Eighteen months of negotiations have shown us that our employers are not interested in fair, reasonable AI protections, but rather flagrant exploitation,” Interactive Media Agreement Negotiating Committee chair Sarah Elmaleh commented at the time.
“Atmospheric” and “walla” voice-acting, Equity observe in their new guidelines, is “one of the segments of our industry most threatened by AI”, presumably because the uncanniness of AI-generated speech is harder to detect when it’s buried in a crowd. While not participating in the SAG-AFTRA strike for legal reasons, Equity have plenty of advice and requests for developers who are thinking about making use of generative AI.
Amongst other things, they suggest that companies confirm in advance whether “the data recorded within the stipulated performance sessions will be used for the stated project only and not re-used in future titles” and that “a pre-purchase / integration fee should be paid when a developer, studio or publisher wishes to hold performance data in their ‘library’ for potential re-use on future projects”.
If you found all this intriguing, I encourage you to read the full text over on Equity’s site - this is just a scraping of its surface. You might also be intrigued by Ken Levine’s thoughts on “turn-based dialogue”. If you’d like to read more about the broad implications of generative AI technology, Mike Cook did a whole essay series on it earlier this year.