Judge Preliminarily Approves Settlement in Anthropic Copyright Case

October 16, 2025

CED Newsletters & Policy Alerts

Timely Public Policy insights for what's ahead

Action: In September, Judge William Alsup, Senior District Judge for the Northern District of California, rejected a proposed settlement in a class action lawsuit alleging that AI developer Anthropic violated federal copyright law by training its AI models on books copied from both pirated and legitimately purchased sources. In denying the motion, Judge Alsup said the settlement lacked sufficient detail about the claims process and about Anthropic's liability for future claims. He has now preliminarily approved the settlement after the plaintiffs submitted an allocation plan outlining how funds would be distributed and addressed his other concerns.

Trusted Insights for What's Ahead®

  • The case concerns one of the core issues in AI model development – the intellectual property rights of the individuals and firms that create the content on which the models are trained. In his June decision, Judge Alsup ruled that Anthropic's use of legally purchased copyrighted books constituted fair use, comparing it to a human reading a book and later drawing on it to form new ideas. However, the judge rejected the company's assertion that its use of books pirated from the internet was also fair use and indicated that the question would go to trial.
  • By settling, Anthropic avoided significant legal and financial risk: the Copyright Act permits statutory damages of up to $150,000 per willful infringement, and the company's library of pirated books contained more than 7 million titles.
  • Under the settlement, the company will pay $1.5 billion to the class, which includes about 500,000 authors. As outlined in the allocation plan, each book covered by the settlement would yield a $3,000 payout split evenly between authors and publishers, with special provisions governing educational works. Anthropic will also be required to delete the millions of books it obtained illegally.
  • Critics of the settlement note that while significant, the $1.5 billion amount represents less than 1 percent of the company's current valuation of $183 billion and may not be large enough to deter future violations. The judge's earlier ruling that the use of legally obtained books constitutes fair use also marks a major victory for AI model developers.
  • AI developers continue to face a variety of legal challenges related to intellectual property. For example, Anthropic is in ongoing litigation with music publishers who allege that it did not take sufficient steps to prevent users from infringing copyrighted lyrics when using its AI model to generate new music.
