What to Consider When Incorporating AI into Your Retail Business

Artificial intelligence has revolutionized the way businesses and individuals operate by using data to power tools that can analyze, respond, predict and create in a manner similar to the human brain. Today, it is hard to imagine any sector where AI has yet to be deployed in some capacity, including, increasingly, retail, which has turned to AI for everything from streamlining the supply chain to personalizing shopping experiences.

But as retailers expand their use of AI technologies, they must be mindful of the legal implications raised by these novel tools, particularly as the law governing training, use and deployment of AI technologies develops. When contemplating whether and how to incorporate AI into their business, retailers should consider the following:

Privacy concerns.
AI’s value lies in the wealth of data that powers its insights. However, mass data processing is a double-edged sword: Although it makes AI smarter, it also requires retailers to abide by a growing body of domestic and international privacy laws that have continually emerged and evolved over the last several years.

In particular, retailers should closely observe collection, retention, sharing, transfer and use requirements attaching to highly sensitive types of personal data, including identifiable physical or behavioral characteristics (like face, voice and body scans) that may constitute biometric data under certain omnibus privacy laws and biometric privacy laws in Illinois, Texas, and Washington. This is true both for the data used by the AI tool developer to pre-train the AI tools used by the retailer, and for data ingested, used or shared by the retailer through its own use of the tool.

The potential utility of biometric data to retailers is wide-ranging. Virtual try-on technologies, which allow consumers to see themselves wearing an item before making a purchase, are an appealing example, but one that depends on exactly the kind of face and body scans these laws regulate. Other online retailers are using face and body data to generate new physical features and models to, among other objectives, increase diversity and size inclusivity in online product displays and browsing experiences.

In brick-and-mortar stores, retailers have implemented contactless checkout features that use facial recognition and other physical data to identify consumers and charge their payment method. These types of uses should be carefully implemented to ensure compliance with relevant privacy laws.

Retailers should also be mindful of non-biometric requirements applicable to the processing of physical and other personal data. For example, in certain jurisdictions, individuals may have a right of publicity in their likeness; indeed, bipartisan legislation currently before Congress would give individuals a property right in their voice and likeness, and would permit the creation of digital depictions of individuals only pursuant to an agreement where the individual was represented by counsel.

Finally, complexity increases for international retailers. Not only do they need to comply with varied jurisdictional requirements at home and abroad regarding collection, use and sharing of data, but they also need to comply with applicable cross-border data transfer requirements when sharing data among their international operations.

Bias concerns.
The quality of AI outputs is dependent, in part, on the quality of the tool’s training data. Accordingly, if an AI tool is fed biased data, it may produce biased results, leading to significant impacts on retailers and consumers.

One increasingly common place for bias to creep into retail AI is the AI-powered loss prevention systems that monitor customer behavior, which have been adopted by a string of large retailers. Functionality differs between tools: some track customer movements and alert security to what the tool considers suspicious behavior, while others use facial recognition to identify repeat offenders from a database of known shoplifters, notifying the security team in the event of a positive match.
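To make the matching step concrete, the sketch below is a minimal, hypothetical illustration in Python of how a facial-recognition watchlist check of this kind might work. The embedding inputs, the WatchlistEntry structure and the 0.85 similarity threshold are assumptions for illustration only, not any vendor’s actual implementation; the point is that a positive match should only queue the sighting for human review, never trigger automatic action.

```python
from dataclasses import dataclass
from typing import List, Optional

import numpy as np


@dataclass
class WatchlistEntry:
    """A single enrollment record in a retailer's known-offender database (hypothetical)."""
    person_id: str
    embedding: np.ndarray  # face embedding computed from the enrollment image


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face embeddings."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def check_against_watchlist(
    probe_embedding: np.ndarray,
    watchlist: List[WatchlistEntry],
    threshold: float = 0.85,  # illustrative value; low-quality enrollment images argue for caution
) -> Optional[str]:
    """Return the best-matching watchlist ID above the threshold, or None.

    A non-None result should only queue the sighting for human review;
    it should never trigger automatic action against the shopper.
    """
    best_id, best_score = None, threshold
    for entry in watchlist:
        score = cosine_similarity(probe_embedding, entry.embedding)
        if score > best_score:
            best_id, best_score = entry.person_id, score
    return best_id
```

As the Rite Aid matter discussed below illustrates, the quality of the enrollment images and the choice of threshold determine how often a system like this wrongly flags innocent shoppers, which is why close human review of every match matters.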

The potential for bias in these systems is clear. In late 2023, the Federal Trade Commission (FTC) prohibited Rite Aid from using facial recognition technology for surveillance purposes in its stores for five years after finding that the pharmacy chain’s loss prevention system inaccurately and disproportionately identified people of color and women as potential shoplifters, leading to wrongful monitoring, searches and expulsions.

In its complaint, the FTC specifically cited the tool’s reliance on a database of tens of thousands of often low-quality images, some of which were pulled from security cameras, employee cellphone cameras and news coverage. Retailers should exercise caution in building or onboarding similar AI-powered loss prevention tools to ensure that they are created and used without bias and with close human review.

Intellectual property concerns.
The use of AI tools in retail design also raises concerns related to the enforceability and infringement of retailers’ intellectual property rights. Intellectual property rights in retail design are already weak. In Star Athletica, LLC v. Varsity Brands, Inc., the U.S. Supreme Court ruled that copyright protection extends only to clothing features that can be perceived as works of art separate from, and protected under copyright independently of, the “useful article” of the garment itself.

The introduction of AI into the design process is likely to weaken that protection further. In August 2023, a federal court affirmed the U.S. Copyright Office’s rejection of an application for copyright registration, ruling that AI-generated works of art made without any human input cannot be copyrighted.

For patent protection — often sought by footwear and outdoor equipment brands for innovative technologies, processes and materials — similar constraints exist. In spring 2023, the Supreme Court declined to hear an appeal from a lower court ruling affirming the U.S. Patent and Trademark Office’s decision to refuse a patent application for a work autonomously created by an AI tool, holding that patents could only be issued to human inventors. Some question remains, however, about the degree of human input necessary to afford protection to AI-generated works. Accordingly, retailers using AI in the design process should ensure close human involvement.

While the use of AI tools may undermine the protectability of retailers’ intellectual property, it may also, conversely, expose those elements that remain protectable to a greater risk of infringement. For example, generative AI models may be trained using third-party logos or other content or features, and as a result may produce content that infringes trademark, trade dress, copyright, rights of publicity or other intellectual property or proprietary rights. Given the widespread accessibility of such AI tools, retailers should closely monitor for potential infringement to protect their brand.

Antitrust concerns.
Regulators are increasingly interested in the potentially anti-competitive effects of AI pricing tools. Thus far, allegations of price fixing via the use of AI have largely focused on the real estate industry. In late 2023, the U.S. Department of Justice lent its support to a group of tenants alleging that their landlords had submitted nonpublic supply, production and pricing information to RealPage Inc.’s software, which was then used to set artificially inflated rental rates. Similar claims have since been filed by attorneys general in the District of Columbia and Arizona and by tenants in Washington.

In December 2023, the U.S. Senate Committee on the Judiciary convened a hearing to discuss AI’s impact on consumer and competition law, with a particular focus on price fixing, and in early 2024 a bill was introduced that would specifically prohibit algorithmic price setting. Accordingly, while retailers may not yet be in the crosshairs of antitrust regulators, AI pricing tools should be examined carefully in light of the growing regulatory interest in this space.

Since bursting onto the scene in 2023, AI has prompted many industries to think deeply about its promises of efficiency and how best to deploy the technology in their businesses. As retailers embrace the myriad new opportunities presented by AI technologies, they should continue to be mindful of their compliance obligations to ensure that AI tools are implemented in a thoughtful, responsible and lawful manner.


Andrew Grant is a Partner and firmwide Chair of Perkins Coie’s Technology Transactions & Privacy Law practice, as well as its Outdoor and Internet and Ecommerce industry groups. Anna Stacey is an Associate in the firm’s Technology Transactions and Privacy practice. Both are based in the firm’s Seattle office.
