The Chair of Parliament’s Culture, Media & Sport Select Committee, Caroline Dinenage, has written to ministers Lisa Nandy and Peter Kyle over reports that the new UK government might be reconsidering the introduction of a copyright exception for the benefit of AI companies.
The previous government put forward a proposal for - but ultimately rejected - a text and data mining (or ‘TDM’) exception in copyright law that would have allowed technology companies to use copyright protected content for training AI models without seeking permission.
In her letter to the new government, Dinenage refers to that “disastrous policy proposal”, before adding “I have deep concerns that the new government is now seeking to resurrect this flawed notion of a ‘TDM exception’”.
Should such an exception be introduced, she says, it would “remove any motivation tech companies would otherwise have to work with the creative industries to devise commercial models that safeguard the incentives and reward for human creativity in the AI era”.
The copyright obligations of tech companies that use copyright protected content to train generative AI models remain under debate around the world. Many AI companies argue that they can rely on copyright exceptions relating to data mining under the copyright laws of at least some countries, or on the concept of fair use under American copyright law.
However, the music industry and other copyright industries are adamant that AI companies must get permission from copyright owners before ingesting any content.
Securing that permission would require companies that train AI with copyright protected content to negotiate licensing deals with copyright owners, and would likely mean paying fees for use of that content. Those fees could potentially extend to onward revenues generated when AI models trained on licensed content produce outputs.
In the UK, there is a copyright exception for text and data analysis, but only in the context of “non-commercial research”. In 2022, the government proposed introducing a wider text and data mining exception that could be used by commercial AI companies, but this was ultimately dropped after widespread backlash from the copyright and creative industries.
In Dinenage’s words, that exception “would have allowed AI developers to scrape creative works from the internet to train their systems without permission and without paying the human creators whose work AI seeks to emulate and compete with”.
After dropping the proposed copyright exception, the last government cobbled together a roundtable involving representatives from both the creative industries and the tech sector in the hope that a code of practice could be agreed around AI and copyright.
That initiative ultimately failed and now the new government is trying to work out an onward plan. That planning is being led by the government’s AI minister Feryal Clark, working alongside Peter Kyle, the Secretary Of State For Science, Innovation and Technology, and Lisa Nandy and her team at the Department For Culture, Media & Sport.
There has been speculation in recent weeks that a text and data mining exception for AI companies is being considered again, though probably more in line with the exception that already exists in European law. That allows copyright protected material to be used in AI training, but also allows copyright owners to opt out from the exception, meaning that their content cannot be used.
However, introducing that sort of ‘exception with opt-out’ in the UK is unlikely to win many fans on either side of the debate. Copyright owners will be strongly opposed to any new exception at all, even one that includes an opt-out or other restrictions.
For AI companies, the opt-out makes the exception fairly useless, particularly if they are looking to use commercially released content in a training dataset. Pretty much all major corporate copyright owners will opt out immediately, and smaller copyright owners are likely to follow suit, given the huge amount of discussion there now is around these issues.
Indeed, those copyright owners that have opted out - or, to use the legal term, ‘reserved their rights’ - from the EU exception have usually done so in a way that anticipates and pre-emptively opts out of any further data mining exceptions that might be introduced in the future.
Concluding her letter, Dinenage says “the only thing government actually needs to do in this area is oblige tech companies to be transparent about the creative work they are using” so that there can be a serious discussion about commercial models.
“While embracing the opportunities of AI”, she ends, “we must be confident that we are not undermining our cultural and creative industries, which are so fundamental to the success of our national economy and our soft power across the world”.