
The UK’s creative industries are rightly celebrated and envied the world over. From Harry Potter to The Beatles, generations of global audiences have grown up appreciating the work of British writers, musicians and filmmakers. It’s no surprise then, that the creative industries make an outsized contribution to the UK economy, generating over £120bn annually – more than the automotive, aerospace, life sciences, and oil & gas industries combined.
The rise of generative AI could put all of this at risk by raising a fundamental challenge: should tech companies be allowed to repurpose other people’s work, without permission or payment?
AI and creative industries: a crucial inflection point
Most creators welcome artificial intelligence. They recognise and, in many cases, are excited by the possibilities opened up by the technology and want to encourage innovation as a key driver of economic growth for the UK as a whole. In my own industry, for instance, it can be used by publishers to improve efficiency, help identify and reach new audiences and make sure our content is tailored to their tastes and needs. However, whilst many are embracing AI, all are concerned about the challenges it could pose to the sector if the tech firms which control it continue to abuse copyright law.
We are now at a crucial inflection point. The UK government has put forward a proposal to introduce a new exemption to copyright law, which would give AI developers the green light to scrape and use publishers' content without consent or compensation, unless publishers actively opt out. It puts the onus on the creators, rather than the technology platforms. It's a deeply flawed idea, which in reality cannot work and risks fatally undermining the business models on which the UK's creative economy depends.
Intellectual property theft on an industrial scale
AI tools are already ingesting vast amounts of content produced by writers and journalists, including articles published behind subscription paywalls. That content is used to train AI systems which generate answers, summaries and products that often compete directly with the original sources, diverting both readers and revenue away from the original creators. We should be deeply concerned about this. It is not innovation, it's intellectual property theft on an industrial scale, and it's already having a major impact.
In my industry, trusted editorial brands have built successful publishing businesses which rely on professional, high-quality, fact-checked journalism. None of this comes for free, and publishers' models rely on copyright protections for their proprietary content. Strip those away, and the entire system begins to unravel. That puts at risk not just the content people know and love, but the jobs of the people who create it. Trusted editorial brands employ over 55,000 people across the UK, and the wider creative industries sector is responsible for over 2.4 million jobs.
We’re clear that the government’s proposed ‘opt out’ solution simply isn’t feasible. Evidence regularly emerges that tech companies are routinely ignoring publishers’ requests to be removed from their LLM training data, even when they claim to offer an ‘opt out’ option. We have heard from our members first-hand that their content is being used without their permission.
Make it Fair
That’s why the Professional Publishers’ Association (PPA), representing hundreds of consumer and B2B publishers across the UK, has submitted a clear and unequivocal response to the government’s consultation on this issue. Our message is simple: AI companies must play by the rules. They must seek permission. And they must pay for what they use.
It’s important to note that finding a model which is fair to publishers will also lead to better AI. Tech firms rely on the high-quality, fact-checked content of PPA members to reduce the hallucinations which do so much to undermine trust in their products. That means they should be incentivised to work with our members to ensure they support, rather than undermine trusted editorial brands’ business models. Instead of being at loggerheads with publishers, the tech sector should be collaborating with us to achieve an outcome that benefits all.
The publishing industry is not alone in making this call. As part of the Creative Rights in AI coalition, we joined forces with colleagues from news, music, and other impacted sectors to launch the Make It Fair campaign, making clear that the UK’s creative industries cannot be sidelined in the race to appease tech giants.
There is precedent here. When radio, television, and digital streaming entered the scene, content creators were compensated through licensing frameworks. AI should be no different. Licensing offers a sustainable path forward that rewards original creators and ensures that AI systems are built on high-quality, legally acquired content. Indeed, there is already a working example of tech platforms partnering with publishers: Apple News splits revenues 50:50.
This kind of fair revenue sharing lies at the heart of what a sustainable regulatory framework could look like. It’s vital that remuneration recognises the contributions of trusted editorial brands, and the government must ensure that AI giants don’t use their market power to pressure publishers into one-sided deals, exchanging visibility in search results for access to training data. Competition regulators should take note.
Ensuring a creative future
Ultimately, this is about more than the future of publishing. It’s about whether the UK wants to support a thriving creative economy or allow it to be mined, hollowed out, and replaced by low-quality, derivative content. AI can be a force for good, but only if built on fair terms.
Ministers will soon decide on the path they will take regarding copyright protections in the age of AI. The choices they make in the coming months will shape the creative landscape for years to come.
The UK has long been a leader in the creative industries. Let’s not trade that leadership for a short-term tech rush. Instead, let’s arrive at a solution which creates an environment where AI can enrich our lives, with fair safeguards for our creators.