Meta uses pirated materials to train Llama AI

Meta knowingly used pirated materials to train its Llama artificial intelligence models – with the blessing of CEO Mark Zuckerberg – according to an ongoing copyright infringement lawsuit against the company. According to TechCrunch, the plaintiffs in the Kadrey v. Meta case have filed documents with the court alleging that the company used the LibGen dataset to train AI.

LibGen is commonly described as a “shadow library” that provides file-sharing access to academic books, journals, images, and other materials. A lawyer for the plaintiffs, who include writers Sarah Silverman and Ta-Nehisi Coates, accused Zuckerberg of approving the use of LibGen for AI training despite concerns from company executives and employees, who described it as “a set of data they know is pirated.”

The complaint says the company stripped copyright information from LibGen’s materials before feeding them to Llama. In a document filed with the court, Meta apparently admitted that it had “removed all copyright paragraphs from the beginning and end” of scientific journal articles, and one of the company’s engineers even wrote a script to remove copyright information automatically. The plaintiffs’ lawyer argued that Meta did this to conceal its copyright infringement from the public. The lawyer also noted that Meta admitted to torrenting LibGen materials, although its engineers were reluctant to share them “from a [Meta] corporate laptop.”

In 2023, Silverman and other authors sued Meta and OpenAI for copyright infringement, accusing the companies of using pirated materials from shadow libraries to train their AI models. The court previously dismissed some of their claims, but the plaintiffs say their amended complaint supports their allegations and addresses the court’s earlier reasons for dismissal.
