
There’s a Plague of Fake AI Reviews Coming – Are Retailers Ready?


Nearly 95% of consumers check online reviews before purchasing a product or service. Reviews provide a valuable signal of trust for buyers, based on other people’s firsthand experiences, and sellers rely on them to boost their reputations and sales in the crowded online marketplace.

However, a new trend threatens to undermine this. Fake reviews generated by artificial intelligence (AI) tools like ChatGPT are rapidly increasing and could soon spiral out of control.

To understand the extent of the issue and how big it might become, Pangram Labs recently carried out a large-scale study across best-selling Amazon product categories. It found widespread evidence of AI-generated reviews being posted, with a large percentage being 5-star and receiving the “verified purchase” stamp.

The Growing Problem of AI-Generated Reviews

The study analyzed nearly 30,000 customer reviews across 500 best-selling products in top categories like beauty, baby, health, appliances and furniture on Amazon.com.


Using our AI detection tool, we found 909 AI-generated reviews, roughly 3% of the total studied. In the baby products, beauty and wellness categories, the share was notably higher, at roughly 5% of reviews.

While this may seem low, the finding should be taken as an early warning by retailers, platforms and consumers alike. With access to and use of AI tools like ChatGPT, Gemini and Anthropic’s Claude growing rapidly, it will soon be hard to trust any “user-generated” content on the web, including product reviews.

AI-generated reviews are especially likely to carry 5-star ratings, and many bear a “verified purchase” stamp, which helps make them appear legitimate to consumers. Of the 500 Amazon best-sellers we analyzed, 74% of AI-written reviews gave the product a 5-star rating, compared with 59% of legitimate human reviews. The reverse holds at the bottom of the scale: 1-star ratings were more common among human reviews than AI-generated ones (10% of AI reviews vs. 22% of human reviews).
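For retailers or researchers who want to run the same kind of comparison on their own review data, the arithmetic is simple once reviews have been labeled by a detector. The following is a minimal sketch, not the study’s actual pipeline: the input file and the column names (rating, is_ai) are hypothetical placeholders.

```python
# Illustrative only: compare star-rating distributions for AI-flagged vs.
# human reviews in a dataset that has already been labeled by a detector.
# The CSV file and column names are assumptions for this sketch.
import pandas as pd

reviews = pd.read_csv("reviews_labeled.csv")  # columns: rating (1-5), is_ai (bool)

by_group = reviews.groupby("is_ai")["rating"]
summary = pd.DataFrame({
    "pct_5_star": by_group.apply(lambda r: (r == 5).mean() * 100),
    "pct_1_star": by_group.apply(lambda r: (r == 1).mean() * 100),
})
print(summary.round(1))  # one row for human reviews, one for AI-flagged reviews
```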

Sellers may intentionally use AI tools to write positive reviews for their products to artificially inflate their star ratings and attract more buyers.

The Role of Regulators in Addressing the Issue

To protect online reviews and maintain consumer trust, regulators and retailers globally must make changes now.

Reassuringly, the risks of fake reviews are slowly being acknowledged and addressed. In the U.S., the Federal Trade Commission (FTC) has banned fake reviews, including AI-written ones, and can impose financial penalties on organizations that breach the rule.

But in the UK, while the Competition and Markets Authority (CMA) has begun enforcing rules against fake reviews under the Digital Markets, Competition and Consumers Act 2024, direct enforcement is difficult because AI-generated reviews are not specified in the act. This lack of clarity could allow AI to continue eroding the trustworthiness of reviews online.

Regulators must do more to define and tackle AI-generated reviews as an issue separate from fake reviews. Legislation needs updating for the new AI era to keep up with the pace of change.

The Need for New Tools and Approaches by Retailers

Retailers and platforms have an important role to play too. Amazon and other influential ecommerce sites need to acknowledge the problem and put more effective measures in place to prevent fake and AI-generated reviews.

To its credit, Amazon has been trying to address AI-created reviews, but based on what the study found, its efforts aren’t working well enough. The volume of AI-generated reviews slipping through the cracks suggests Amazon still has work to do.

As AI tools become increasingly accessible, the problem of fake reviews will continue to grow. These models can produce convincing, human-like reviews in seconds, making it difficult for consumers to know what is real and what is artificial. This undermines the purpose of product reviews, which is to provide honest, unfiltered opinions from real users.

The findings of our study also call into question the usefulness of Amazon’s AI-generated customer review highlights, introduced in 2023. The feature provides a short paragraph on the product detail page so customers can understand at a glance the common themes across hundreds or even thousands of published reviews, including product features and buyer sentiment. If the share of phony AI-generated reviews on product pages grows, that summary becomes essentially useless.

To protect online shoppers and the integrity of the customer review system, retailers could consider introducing AI detector technology as part of the review posting process. This would flag AI-generated content before it is published. They must also act fast to respond to any concerns raised by customers or competitors about individual sellers’ reviews.
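What such a check might look like in practice is sketched below. This is a hypothetical illustration only: the detection endpoint, response field and threshold are placeholders, not a real service or any specific vendor’s API, and a production system would pair automated flagging with human moderation.

```python
# Minimal sketch of an AI check in a review-submission flow.
# DETECTOR_URL, the "ai_probability" field and the threshold are hypothetical.
import requests

DETECTOR_URL = "https://detector.example.com/v1/classify"  # placeholder endpoint
AI_THRESHOLD = 0.9  # reviews scoring above this are held for moderation

def review_passes_ai_check(review_text: str) -> bool:
    """Return True if the review can be published immediately."""
    resp = requests.post(DETECTOR_URL, json={"text": review_text}, timeout=5)
    resp.raise_for_status()
    ai_score = resp.json()["ai_probability"]  # assumed response field
    return ai_score < AI_THRESHOLD

if __name__ == "__main__":
    sample = "This cream changed my life, absolutely flawless in every way."
    if review_passes_ai_check(sample):
        print("Publish review")
    else:
        print("Hold for human moderation")
```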

Consumers themselves also play a key role. If they are leaving a review, they should avoid using AI tools to write it for them. Honest reviews are essential for maintaining trust and ensuring that purchasing decisions are based on accurate, firsthand information.


Max Spero, CEO of Pangram Labs, is a seasoned machine learning engineer. Before co-founding Pangram Labs, he most recently worked on autonomous vehicles at Nuro, leading their active learning effort. He has a long history of deploying successful machine learning products at Google, Two Sigma, and Yelp. Spero holds a B.S. in theoretical computer science and an M.S. in artificial intelligence from Stanford University.
