House of Lords pass transparency requirement amendment for AI training

The Lords voted 272 to 125 in favour of the amendment to the Data (Use and Access) Bill, which would require AI companies to reveal which copyrighted materials they have used to train their models.

On Monday evening, the House of Lords passed an amendment to give rightsholders meaningful transparency regarding the use of copyrighted works in AI training. We are glad to see that the House of Lords has recognised the concerns of creators across the country.   

The Government must now listen to these concerns. When the Data Bill returns to the Commons, we hope that the Government will acknowledge the urgent need for transparency in the development of AI models. With this significant show of support for creators in the House of Lords, we hope the Government will take this opportunity to support authors and the wider creative industries.  

Speaking in the Lords, ALCS Chair Lord Clement-Jones said: “The rapid development of AI, particularly Large Language Models, relies heavily on vast volumes of data for training. This has brought into sharp focus the way copyright law applies to such activity. Rightsholders, from musicians and authors to journalists and visual artists, are rightly concerned about the use of their copyrighted material to train AI models, often without permission or remuneration, as we have heard. They seek greater control over their content and remuneration when it is used for this purpose, alongside greater transparency.” 

Baroness Kidron said: “Creators do not deny the creative and economic value of AI, but we do deny the assertion that we should have to build AI for free with our work and then rent it back from those who stole it. My Lords, it is an assault on the British economy, and it is happening at scale to a sector worth £120bn to the UK, an industry that is central to the industrial strategy and of enormous cultural import.”

Generative AI depends on the works of UK creators, yet so far most developers have used these works without the consent, remuneration or even awareness of authors. At a minimum, we must begin to establish transparency requirements so that authors are able to exercise the rights that copyright grants them in their work.

Effective transparency regulation is an essential first step towards a dynamic licensing market that gives authors control over their work and the option of remuneration if they choose, while providing developers with legitimate routes to the quality content their models depend on.

Our survey of members last year found that while 91% of authors felt they should be asked for permission before their works are used to train AI models, 77% did not know whether their works had been used, highlighting the severe lack of transparency.

You can learn more about our campaigning work here.