The BFI has released a new report on the use of generative AI tech in the screen sector.
The report, AI in the Screen Sector: Perspectives and Paths Forward, states that sources of AI training data include scripts from more than 130,000 films and TV shows, YouTube videos, and databases of pirated books.
The report warns that as generative models learn the structure and language of screen storytelling – from text, images and video – they can then “replicate those structures and create new outputs at a fraction of the cost and expense of the original works.” It says that while these learned capabilities can be used to assist human creatives, “AI tools may also be used to compete against the original creators whose work they were trained on.”
A coalition of screen sector organisations, including the BBC, Channel 4, Fremantle, ITN, ITV and Pact, believes that AI developers should not scrape creative sector content without express permission, and that a framework supporting the licensing of copyright content for AI training is the best way for the UK to share in the opportunity created by AI.
The report also raises concerns about AI’s ability to automate tasks, which could lead to job losses, “particularly for junior or entry-level positions”, as well as the high energy consumption required to run AI models.
The report focuses on nine recommendations for the next three years that will “enable the UK screen sector to remain in the vanguard of innovation.”
The recommendations include establishing the UK as a world-leading market of IP licensing for AI training; embedding sustainability standards to reduce AI’s carbon footprint; and supporting cross-disciplinary collaboration to develop market-preferred, culturally inclusive AI tools. The report calls for structures and interventions to pool knowledge, develop workforce skills and target investments at the UK’s high-potential creative technology sector. Finally, it urges support for independent creators through accessible tools, funding and ethical AI products. Case studies within the report show how and where AI technologies have already become assimilated into production processes and where advancements are likely to be made.
Recommendation 1
Rights: Position the UK as a world-leading IP licensing market
There is an urgent need to address copyright concerns surrounding generative AI. The current training paradigm – where AI models are developed using copyrighted material without permission – poses a direct threat to the economic foundations of the UK screen sector. A viable path forward is through licensing frameworks: 79 licensing deals for AI training were signed globally between March 2023 and February 2025; the UK’s Copyright Licensing Agency is developing a generative AI training licence to facilitate market-based solutions; and companies such as Human Native are enabling deals between rightsholders and AI developers. The UK is well-positioned to lead in this space, thanks to its ‘gold standard’ copyright regime, a vibrant creative technology ecosystem, and a coalition of creative organisations advocating for fair licensing practices. For this market to be effective, new standards and technologies are required, as outlined in a May 2025 CoSTAR National Lab report. By formalising IP licensing for AI training and fostering partnerships between rightsholders and AI developers, the UK can protect creative value, incentivise innovation, and establish itself as a hub for ethical and commercially viable AI-supported content production.
Recommendation 2
Carbon: Embed data-driven guidelines to minimise carbon impact of AI
Generative AI models, particularly large-scale ones, demand significant computational resources, resulting in high energy consumption and associated carbon emissions. Yet the environmental footprint of AI is often obscured from end users in the creative industries. Transparency is a critical first step to addressing AI’s environmental impact. UK-based organisations such as Blue Zoo are already choosing to run AI models on infrastructure where energy sources and consumption are fully visible. These practices, combined with calls for regulatory frameworks akin to appliance energy labels, demonstrate a need for sustainability-focused AI guidelines. With the screen sector in the vanguard of generative AI uses globally, it is ideally positioned to push the demand for carbon minimisation, and the UK screen sector should lead by example.
Recommendation 3
Responsible AI: Support cross-discipline collaboration to deliver market-preferred, ethical AI products
Generative AI tools must align with both industry needs and public values. Many models, tools and platforms have been developed without sufficient input from the screen sector (or, indeed, screen audiences), leading to functionality and outputs that are poorly suited to production workflows or that risk cultural homogenisation and ethical oversights. (Use of large language models trained predominantly on US data may marginalise local narratives, for example.) Academics have called for ‘inclusive’ approaches to AI development, arguing that generative AI’s full potential can only be reached if creative professionals participate in its development. The feasibility of cross-disciplinary collaboration is demonstrated by Genario – a screenwriting tool created in France by a scriptwriter and an AI engineer. Embedding collaborative, inclusive design processes can enhance the relevance of AI tools to creative tasks, as demonstrated by Microsoft’s Muse experiment. These processes also ensure that AI models reflect ethical standards and cultural diversity. The UK should look to combine its strengths in AI and humanities research, and its reputation for merging technology and culture, to deliver responsible, ethical AI.
Targeted support
Recommendation 4
Insight: Enable UK creative industry strategies through world-class intelligence
The UK has over 13,000 creative technology companies and a strong foundation in both AI research and creative production. However, across the UK screen sector, organisations, teams and individuals – especially SMEs and freelancers – lack access to structured intelligence on AI trends, risks, and opportunities. This absence of shared infrastructure for horizon scanning, knowledge exchange, and alignment limits the sector’s ability to respond cohesively to disruption. The BFI has proposed creating an ‘AI observatory’ and ‘tech demonstrator hub’ to address this urgent challenge, and the proposal has been endorsed by the House of Commons Culture, Media and Sport Committee as a way to centralise insights from academia, industry, and government, and provide hands-on experience of emerging tools and capabilities.
Recommendation 5
Skills: Develop the sector to build skills complementary to AI
AI automation may, in time, lower demand for certain digital content creation skills. It may also create new opportunities for roles that require human oversight, creative direction, and technical fluency in AI systems. Our research identifies a critical shortfall in AI training provision: AI education in the UK screen sector is currently more ‘informal’ than ‘formal’, and many workers – particularly freelancers – lack access to resources that would support them to develop skills complementary to AI. However, the UK is well-positioned to lead in AI upskilling due to its strong base of AI research institutions, a globally respected creative workforce, and a blending of technology and storytelling expertise. By helping workers transition into AI-augmented roles, the UK can future-proof its creative workforce and maintain its competitive edge in the global screen economy.
Recommendation 6
Public transparency: Drive increased public understanding of AI use in screen content
Transparency will drive audience trust in the age of generative AI. Surveys reveal that 86% of British respondents support clear disclosures when AI is used in media production, and this demand for transparency is echoed by screen sector stakeholders, who call for standards on content provenance and authenticity to counter the rise of AI-generated misinformation and ‘slop’. National institutions such as the BBC are already experimenting with fine-tuning AI models to reflect their editorial standards, and the BFI is deploying AI in archival work with a focus on ethical and transparent practices. These efforts demonstrate the UK’s capacity to lead in setting audience-facing standards and educating the public about generative AI’s new and developing role in content creation.
Growth
Recommendation 7
Sector adaptation: Boost the UK’s strong digital content production sector to adapt and grow
The UK boasts a unique convergence of creative excellence and technological innovation, with a track record of integrating emerging technologies into film, TV, and video game production. London is the world’s second largest hub (after Mumbai) for VFX professionals. Generative AI is already being used across the UK screen sector to drive efficiencies, stimulate creativity, and open new storytelling possibilities – from AI-assisted animation (Where the Robots Grow) and visual dubbing (Flawless) to reactive stories and dialogue (Dead Meat). However, surveys identify a lack of AI training and funding opportunities, while Parliamentary committees point to fragmented infrastructure and an absence of industry-wide standards that could hinder the continued growth and development of AI-supported creative innovation. Our own roundtable discussions with the sector highlighted the need for resources to better showcase the sector’s R&D work, support collaboration and reach new investors.
Recommendation 8
Investment: Unlock investment to propel the UK’s high-potential creative technology sector
There is a compelling opportunity and a pressing need for targeted financial support for the UK’s creative technology sector. The UK is home to global creative technology leaders including Framestore and Disguise, as well as AI startups such as Synthesia and Stability. However, the House of Lords has identified a “technology scaleup problem” in the UK, with limited access to growth capital, poor infrastructure, and a culture of risk aversion acting as barriers to expansion. A Coronation Challenge report on CreaTech points to “significant” funding gaps at secondary rounds of investment (Series B+ stages) which are “often filled by international investors … creating risks of IP and talent migration out of the UK”. The report also found that physical infrastructure is needed, stating that: “Those involved in CreaTech innovation can struggle to find space to demonstrate, and sell, their work.” Commenting on a February 2025 House of Lords Communications and Digital Committee report into the scaleup challenge, inquiry chair Baroness Stowell called for action to “unravel the complex spaghetti of support schemes available for scaleups” and “simplify the help available and ensure it is set up to support our most innovative scaleups to grow”.
Recommendation 9
Independent creation: Empower UK creatives to develop AI-supported independent creativity
Generative AI is lowering traditional barriers to entry in the UK screen sector – enabling individuals and small teams to realise ambitious creative visions without the need for large budgets or studio backing. UK-based director Tom Paton describes how AI breaks down barriers that have “kept so many creators on the sidelines”, while the Charismatic consortium, backed by Channel 4 and Aardman Animations, sees the potential of AI “to support creators disadvantaged through lack of access to funds or the industry to compete with better funded organisations”. The emergence of AI-first studios such as Wonder, which secured £2.2 million in pre-seed funding, further demonstrates the viability of independent, AI-supported content creation. By investing in accessible tools, training, and funding for independent creators, and developing market-preferred, ethical AI products, the UK can foster a more inclusive and dynamic creative economy where AI enhances, rather than replaces, human imagination.
AI in the Screen Sector: Perspectives and Paths Forward is available at bfi.org.uk/ai-screen-sector and www.costarnetwork.co.uk
Staff Reporter