Training Without Consent is Risky Business: What Business Owners Need to Know About the Proposed AI Accountability and Data Protection Act

AI is a powerful tool for enhancing human creativity: designing and generating new content, creating new products and improving existing ones, and much more. But the legal landscape governing whether AI-related content is protectable, or could give rise to liability, has been changing constantly, making it challenging for companies to properly address the risk of noncompliance with current law.

On July 21, 2025, Senators Josh Hawley (R-MO) and Richard Blumenthal (D-CT) introduced the AI Accountability and Data Protection Act (the “AI Act”). Most significantly, the AI Act proposes a new federal cause of action allowing individuals to sue companies that train AI models on individuals’ personal or copyrighted data without their affirmative consent. If passed, the AI Act would have significant ramifications both for companies leveraging AI tools and for creators of copyrightable content.

State of Play in AI Copyright Litigation

The tension between the creative use of technology and copyright law is longstanding, but the rise of generative artificial intelligence, particularly large language models (LLMs), has raised even more complex questions about ownership of creative content. Litigation against large AI companies using LLMs has largely hinged on judicial interpretations of “fair use,” a defense under copyright law by which a defendant claims it is entitled to use a copyrighted work without the owner’s permission for certain traditionally non-commercial purposes. The defense applies where the public’s right to, or benefit from, a copyrighted work outweighs the rights of its owner.

Thus far, courts have found that the training of LLMs is indeed fair use, albeit for differing reasons and not in all contexts. Most recently, a plaintiff group of authors settled litigation against Anthropic after a court in the Northern District of California found that, while Anthropic’s use of lawfully acquired books to train its LLM could be considered fair use, preserving copies of pirated books was not, and constituted infringement. Bartz v. Anthropic PBC, No. 24-cv-05417 (N.D. Cal.).

Anthropic’s potential liability for copyright infringement ran into the billions of dollars, and on September 5, 2025, the parties announced that the case had settled for an eye-popping $1.5 billion. So far, only one court, in Thomson Reuters v. Ross Intelligence, No. 1:20-cv-613-SB (D. Del.), has found that the commercial benefit of training AI models outweighs the fair use defense, and the model there, unlike most of those currently in high-profile litigation, was not generative.

Core Terms of the AI Act

The AI Act would remove the defense of fair use by creating a federal cause of action for the “appropriation, use, collection, processing, sale, or other exploitation of individuals’ data without express, prior consent.” These actions explicitly include training a generative AI system and the generation of covered data. “Covered data,” in addition to personally identifiable information, includes data “generated by an individual and [] protected by copyright.”

“Generation” includes content that “imitates, replicates, or is substantially derived from” the covered data, a concept far broader in scope than merely using copyrighted or other protected works in training. Generation could extend to nearly any AI output traceable to a protected source. Importantly, the AI Act explicitly does not limit protection to registered works, unlike the Copyright Act, which requires registration before an individual may initiate an infringement lawsuit.

The AI Act allows recovery of compensatory damages equal to the greatest of actual damages, treble profits, or $1,000, as well as punitive damages, injunctive relief and the collection of attorney’s fees and costs. It also allows for secondary liability for an entity that “aided and abetted another person” in the enumerated list of actions. In short, the AI Act provides multiple significant avenues of recovery if consent is not obtained prior to use of the covered data.

What Does the AI Act Mean for the Creation of New Content?

If passed, the AI Act would strengthen the enforcement rights of individual copyright owners, including businesses owning copyrights in the design of their goods or content, against companies that use those works to train their AI models or other AI tools. Under current law, enforcement requires registration of the copyright in a product design; under the AI Act, by contrast, the owner of a hat design could enforce its rights against an entity that used the design for its AI tool without first securing a registration.

Companies should seek guidance on whether their content is protectable and implement monitoring systems to detect infringement of that content through its use in AI tools, since the AI Act may give them a new opportunity to enforce their rights if their content is used improperly. Additionally, if designers use AI to create retail products without acquiring proper permission, retailers selling those products could be liable under the AI Act absent proper indemnification from the designer or manufacturer.

What Does the AI Act Mean for Companies Using AI Tools?

The AI Act would significantly increase risk of liability for companies currently leveraging AI tools that rely on the covered data in their product design, marketing and social media campaigns, or other applications. In-house legal teams at companies leveraging AI tools should act now to:

  • Review and strengthen policies before the AI Act becomes law to include best practices for acquiring the “express, prior consent” required to use copyrighted works or other protected data.
  • Audit training datasets and data collection practices to identify use of covered data – namely, content that could be subject to copyright protection, or other personal data – without the required level of consent.
  • Ensure consent according to the AI Act is “freely given, informed, and unambiguous” whether through licenses, website terms or other agreements.
  • Disclose all entities with access to the data as part of the consent process. Retailers and marketers must confirm that any content they acquire is used with the explicit consent of its owner.

Conclusion

Although the AI Act has not yet progressed out of the Judiciary Committee, companies should prepare now by reviewing and enhancing their AI-related processes, including the acquisition of consent for the ingestion, training or other use by those AI tools of any potentially protectable data. Content owners should also strengthen their monitoring efforts to detect unauthorized use by others.

Retailers, in particular, should verify that they are acquiring the appropriate permissions, and ensure that designers, influencers and other content creators are doing the same before using or training AI tools.


Nicholas A. Rozansky is a partner at Brown Rudnick LLP who frequently addresses brand and reputation management and related intellectual property issues. His clients span industries including fashion and apparel, jewelry, banking and finance, consumer products, and sports and entertainment. As a leading litigator, he advises on pre-litigation and litigation matters, risk avoidance, collaboration deals, IP protection and business strategies.
