In a letter to The Times, a coalition of over 1,000 artists — including Kate Bush, Sir Paul McCartney and Andrew Lloyd Webber — condemns reforms that would let AI companies train on copyrighted works without permission.
They warn that the proposals amount to a “copyright heist”, handing Big Tech a free pass to exploit creative content, jeopardising the future of the industry.
Meanwhile, ministers insist the changes are needed to boost the UK’s AI sector and turn it into an AI superpower, though mounting pressure from the creative industries has left the outcome uncertain.
Industry backlash
The backlash against the UK Government’s proposed AI copyright reforms isn’t just coming from musicians.
More than 30 leaders from major performing arts institutions, including the National Theatre, Opera North and the Royal Albert Hall, have warned that these changes could destabilise the industry by eroding the rights of freelancers who rely on copyright to sustain their careers.
In a joint statement published in The Guardian, the leaders highlighted that the arts depend on a “fragile ecosystem” of independent creatives, many of whom have spent years honing their craft.
They argue that AI companies should not be given free rein to use copyrighted material without permission, calling on the government to uphold the moral and economic rights of performers, writers and artists across music, dance, drama and opera.
While these organisations support technological innovation, they caution that the reforms risk sidelining creatives from the development of AI, rather than empowering them to participate in it.
They also urge the government to impose transparency rules on AI developers, ensuring they disclose what copyrighted material they train their models on and how it was obtained.
Current laws
At the time of writing, UK copyright law gives artists full control over their work, meaning AI companies must obtain permission before using it to train their models.
If the government’s proposal were to go ahead, creators would instead have to actively “opt out”, making it easier for tech companies to use music, writing and artwork without direct consent.

For artists, that’s a problem. If AI can scrape their work for free, what happens to their ability to earn a living? It reinforces the “copyright heist” claim, allowing Big Tech to profit from creative content without paying the people who made it.
In turn, this could make it even harder for musicians, authors and filmmakers to sustain their careers, especially as AI-generated content floods the market.
Exploitation or innovation?
Beyond the immediate impact on artists, this raises bigger legal and ethical questions about the future of creative industries. If AI models can replicate human-created content without permission, does that diminish the value of original work?
Should AI-generated content that mimics human creativity be classified as a derivative work under copyright law? How can regulations balance AI-driven innovation with the rights of the individuals who produce the material that fuels it?
This debate is not unique to the UK. In the EU, the AI Act includes transparency requirements that force companies to disclose which copyrighted materials their models have been trained on.
By contrast, the UK government appears to be taking a more lenient approach with the aforementioned “opt-out” system, one that would allow AI firms to train on copyrighted works by default unless creators explicitly objected.
Following strong industry opposition, the plan was put on hold, but it has not been scrapped. From a regulatory perspective, this fight is over intellectual property rights and legal protections.
Critics argue that an “opt-out” system would rewrite copyright law in a way that benefits AI companies while forcing creators to take extra steps to protect their work, sparking debate around ethics and fairness.
Without proper safeguards, AI training could become a form of exploitation, allowing companies to profit from creative labour without credit or compensation.
Yet AI firms argue that their models do not copy but create new, transformative works based on broad patterns in data. Some claim that AI-generated content enhances creativity rather than replacing it, offering artists new tools for innovation.
Proponents also point to economic benefits. The UK government believes a more relaxed copyright regime could attract AI investment, positioning the UK as a global leader in AI development.
AI vs artists
This legal battle isn’t just about loopholes for UK artists; it’s about their livelihoods.
When e.motion’s AI dance track ‘Somebody Else’ went viral, Lily Allen sought clarity on whether the vocals were AI-generated or genuinely hers and Fred Again’s. AI is already blurring the line between human and machine-made creativity.
Many fear that without stronger protections, their work could be exploited with little to no benefit to them. If AI can be trained on an artist’s songs, writing or artwork without permission, it could swamp the market with cheap, AI-generated alternatives.
For musicians, that could mean fewer royalties. For writers, it could mean AI churning out copycat versions of their work. For the creative industry, it could mean a race to the bottom, where originality is undervalued in favour of AI-generated efficiency.
That’s why big names and large institutions are pushing back. They see a world in which AI companies benefit from their work without giving anything in return, which they argue is nothing short of exploitation.
Conversely, the government argues that loosening copyright rules will fuel innovation and turn the UK into an AI powerhouse, but critics say that shouldn’t come at the expense of the £126 billion creative sector.
With pressure mounting from the industry’s biggest voices, the question is whether ministers will listen or whether artists will be left fighting an uphill battle. The ideal outcome would protect creators’ rights in the age of AI without shutting the door to innovation.