Actors’ union, Equity, is to hold an indicative industrial action ballot for its members working in film and TV on the issue of artificial intelligence protections.
It will be the first time the performing arts and entertainment trade union has asked this entire section of its membership to vote in a ballot, indicative or otherwise.
Performers will be asked whether they are prepared to refuse to be digitally scanned on set in order to secure adequate artificial intelligence protections. The ballot will open today (Thursday 4 December) and run until Thursday 18 December.
As the ballot is indicative, it will gauge the level of support among members for this action short of a strike. It is not binding, nor would it legally protect members who refuse to be digitally scanned on set; for that, a statutory ballot would need to be held, a possible next step for the union.
Equity is currently negotiating with producers’ trade body Pact over a new agreement to set minimum pay, terms and conditions for actors, dancers and stunt performers working in film and TV.
Alongside issues such as pay, secondary payments (royalties and residuals), self-tapes (performers submitting recordings of themselves as part of the audition process), and hair and make-up provisions for the global majority, artificial intelligence is a key issue for Equity members.
Equity says members are increasingly concerned about the use of their voice and likeness, including being digitally scanned on set. The union is fighting for protections based on the principles of “explicit consent, transparency of terms, and fair remuneration for usage”.
Equity says the two bodies have so far made “significant progress” in negotiations on protecting performers’ rights when working with digital replicas (digital copies of real performers) and synthetic performers (artificially generated performers). It adds: “However, a major section of Equity’s claim remains unaddressed regarding the use of data, such as recorded performances or digital scans, to train AI systems.”
Equity argues that producers, content owners or any third party should not use performers’ data for this purpose without informed consent. “But Pact has not responded with adequate contractual assurances on this matter,” the union says.
Equity General Secretary, Paul W Fleming, says: “While tech companies get away with stealing artists’ likeness or work, and the Government and decision-makers fret over whether to act, unions including Equity are at the forefront of the fight to ensure working people are protected from artificial intelligence misuse.
“It is through union-negotiated agreements that set minimum pay, terms and conditions, that we can collectively ensure performers’ AI rights are protected.
“So it is disappointing that Pact is still not agreeing to protect our members when it comes to training AI. If bosses can’t ensure someone’s likeness and work won’t be used without their consent, why should performers consent to be digitally scanned in the first place?
“This indicative ballot gives Equity members an opportunity to send a clear message to the industry: that it is a basic right of performers to have autonomy over their own personhood and identity.
“Nobody wants further instability in our industries ahead of what we hope will be a positive year in 2026. However, with the inadequacy of the deal on the table, Equity has no choice but to recommend members support industrial action. It’s time for the bosses to step away from the brink and offer us a package, including on AI protections, which respects our members.”
A Pact spokesperson said: “There are two elements to this: Outputs (such as de-aging actors in edit, performance altering, digital replicas, etc), they are not in dispute at all, we’ve agreed a mechanism for consent and payments where due. Then Inputs (compiling data to create LLMs), where that is done in order to help production (e.g. design lighting, archiving etc), that is not a problem. The issue is when you take all that data and you effectively are creating more content with it. None of our members are doing that at the moment. They are not selling it or monetising data in this way. Equity wants future facing protections and we don’t know what the future looks like. We’ve said that we’ll have a dialogue as things develop so that we can have informed discussion about protections and monetisation. Of course, members have scanned actors for many years (long before AI was used in production). Producers are well aware of their obligations under data protection law.”
Jon Creamer