Over 15,000 creators - more than 1,000 of whom are music creators, including musicians like Thom Yorke, Nitin Sawhney and Aurora - have endorsed a short statement demanding that AI companies get permission before using existing content to train generative AI models.
That statement reads, “The unlicensed use of creative works for training generative AI is a major, unjust threat to the livelihoods of the people behind those works, and must not be permitted”.
Creators - and organisations across the creative industries - are now being encouraged to sign the statement via a bespoke website.
It has all been organised by Ed Newton-Rex, the former VP Of Audio at Stability AI, who resigned from the company because he disagreed with Stability’s stance that AI training is ‘fair use’ under American copyright law - meaning existing content can be used without getting permission from creators and copyright owners.
He has since become an outspoken critic of that position, which is taken by many AI companies, and has set up the organisation Fairly Trained to verify and showcase AI companies that only train their models on content for which they have secured copyright owner permission.
Setting out why he believes AI companies must get permission before using existing content as training data, Newton-Rex told The Guardian, “There are three key resources that generative AI companies need to build AI models: people, compute and data. They spend vast sums on the first two - sometimes a million dollars per engineer and up to a billion dollars per model. But they expect to take the third - training data - for free”.
“When AI companies call this ‘training data’, they dehumanise it”, he added. “What we’re talking about is people’s work – their writing, their art, their music”.
Outside the US, many AI companies argue that they can rely on the text and data mining exceptions found in some copyright systems to make use of copyright protected works without getting permission.
In the UK, the current copyright exception covering text and data analysis only applies to non-commercial research, but at one point the last government considered expanding it to apply to all AI companies. That plan was abandoned, but there have been reports that the new Labour government is now considering something similar.
That prompted the Chair of Parliament’s Culture, Media & Sport select committee, Caroline Dinenage MP, to write to ministers Lisa Nandy and Peter Kyle expressing deep concerns that “the new government is seeking to resurrect this flawed notion of a ‘text and data mining exception’”.
The copyright industries - including the music industry - are adamant that AI companies should have to get permission to use existing works. That requirement would force AI companies to negotiate licensing deals with copyright owners, allowing those owners to secure upfront and, possibly, ongoing payments.
Which is why individuals, companies and organisations from across the music industry have all endorsed the statement put together by Newton-Rex.
That said, there also remains disagreement within the music community over whose permission is required. In the music industry, it is often record labels and music publishers which control the copyright in recordings and songs. There has been much debate as to whether that means labels and publishers can unilaterally grant permission to those AI companies seeking licences, or whether specific permission is also required from individual artists and songwriters.
The five organisations that make up the Council Of Music Makers - the Featured Artists Coalition, Ivors Academy, Musicians’ Union, Music Producers Guild and Music Managers Forum - all argue that music-maker consent is also required, which is something they stressed while welcoming the statement organised by Newton-Rex.
They said they “wholeheartedly endorse” the statement, but then homed in on its reference to “the people behind the works”, adding, “To be clear, the people behind the music are the music-makers. It is paramount that explicit consent is sought from music-makers before their music is used to train AI - including by rightsholders when negotiating licensing deals with AI companies - and music-makers must be fairly compensated from use of their work in this way”.
Other music industry organisations have also endorsed the statement organised by Newton-Rex, including the Association Of Independent Music, whose Gee Davy says, “On behalf of the UK’s independent music community - businesses who are proud to work in partnership with artists - we support this statement from Fairly Trained”.
“To achieve the benefits of AI for creativity, we urge policymakers not to lose sight of the need for strong copyright protections”, she adds. “This is vital to ensure a healthy future for those who create, invest in and release music across genres and all communities, regions and nations of the UK”.
Sophie Jones, Chief Strategy Officer at BPI, adds, “While the British music industry is already embracing AI’s many positive use-cases, it is also our firm view that a broad copyright exception for text and data mining by AI firms would be hugely damaging to the UK’s creative industries”.
“Copyright serves to safeguard the value of human creativity, while also driving value in the wider music and creative industries”, she goes on. “If the UK is to remain a global creative powerhouse in an increasingly competitive world, the government must ensure that it is respected and enforced”.
To ensure that happens, she concludes, ministers should require AI firms “to seek authorisation before taking copyrighted content, coupled with transparency obligations, including record keeping, to enable a healthy and fair market to flourish”.