
FTC Proposes a New Rule to Rein In Fake Reviews

Update 10/17/2023: An alliance of cross-industry leaders, including Amazon, is coming together to ensure the integrity of online reviews. 

Coalition for Trusted Reviews

Amazon is collaborating with Booking.com, Expedia Group, Glassdoor, Tripadvisor, and Trustpilot to introduce the first global Coalition for Trusted Reviews. Their goal is to:

  • Establish industry standards for detecting fake reviews
  • Promote best practices in managing online reviews
  • Facilitate the exchange of intelligence concerning deceptive activities by entities involved in the sale of fraudulent reviews
  • Safeguard consumer access to reliable information on their respective platforms

For quite some time, online marketplaces have grappled with the persistent issue of bogus reviews, despite their ongoing attempts to eliminate it.

A significant portion of this problem can be attributed to fake review brokers who, in pursuit of monetary gain, freebies, or various incentives, actively seek fabricated customer reviews via social media and encrypted messaging apps. These brokers engage in both promoting fake positive reviews to enhance business and seller sales and orchestrating negative reviews to detrimentally affect their competitors’ sales and performance.

That is why Becky Foley, VP of Trust & Safety at Tripadvisor, has emphasized a dedicated commitment to targeting individuals who attempt to sell phony reviews to businesses seeking to artificially boost their star ratings and online reputations. Eradicating such practices is an immediate priority for the coalition.

“These actors often operate outside of jurisdictions with a legal framework to shut down fraudulent activity, making robust cooperation even more important,” she said in a press statement.

The formation of the trusted reviews coalition stems from discussions that emerged during a conference on “Fake Reviews” arranged by Tripadvisor in San Francisco last year. The companies have confirmed their intention to convene once again, with plans to meet early in December at a follow-up conference hosted by Amazon, scheduled to take place in Brussels.

Minimizing Regulatory Risk

While the timing may have nothing to do with the FTC’s recent fake reviews rule proposal, it is interesting to see how Amazon’s actions before and after government intervention seem to correlate.

The founding of the coalition could be seen as a way to minimize the regulatory risk that the pending rule on fake reviews may create if passed into law.

Regulatory risk pertains to the potential danger posed by the introduction of new laws, regulations, or amendments to existing ones, which could result in companies falling out of compliance with their obligations. This non-compliance may lead to financial burdens, decreased profitability, business loss, or other detrimental effects on their functions, reputation, or financial performance.

By keeping abreast of evolving legal frameworks, guidelines, and rules while actively monitoring regulatory activities, Amazon can take a proactive approach to assess how emerging changes might introduce new risks or require policy adjustments.

In response, the company can implement measures to minimize these risks, control expenses, and secure ongoing compliance.

On June 30, 2023, the Federal Trade Commission (FTC) published a new proposed “Rule on the Use of Consumer Reviews and Testimonials,” or Part 465, which aims to illegalize certain customer review practices and authorize courts to impose a civil penalty of up to $50,000 per violation.

Factors that Compelled the FTC to Take Action

The success of a product frequently hinges on online reviews, which retailers and search engines prominently display. This is done to help customers make smart decisions, but recent studies suggest that some of these reviews may be deceptive and misleading.

In fact, unreliable reviews have become so prevalent that 85% of customers believe what they read online is sometimes or often fake or fraudulent.

On Amazon.com alone, 42% of 720 million reviews analyzed by Fakespot were deemed fake in 2020. Many of the bogus reviews are the work of AI bots, strategically aimed at manipulating product ratings. Additionally, some third-party sellers have reportedly resorted to unethical practices, such as incentivizing positive reviews through cash payments or free or discounted products.

Even though using shady review tactics violates Amazon’s product review policy, it helps bad actors influence customers to buy their products over competitors, according to this 2022 study.

The study, conducted in the UK, presented 10,000 shoppers with five identical products.

During the experiment, certain participants were exposed to fake reviews, inflated star ratings, or a combination of both. The findings revealed that these unreliable testimonials hit consumers’ wallets, causing them to spend an additional $0.12 for every dollar spent.

The study also found that individuals exposed to such deceptive information were six percentage points more likely to end up purchasing a flawed product.

Notably, the influence of star ratings was also brought to light. The paper disclosed that a mere one-star increase in a product’s rating led to a substantial surge in demand, driving it up by 38%.

Lastly, the study found that providing people with warnings and educational resources on the topic of review manipulation, though not fully effective, could curtail the adverse effects by 44%.

This study sheds light on the importance of transparent and genuine feedback in the decision-making process of consumers, as fake reviews undercut honest businesses, tarnish brands, erode trust between sellers, customers and platforms, and overall lead to bad shopping experiences – all of which can have emotional and economic repercussions.

As for the efforts marketplace platforms take in policing these reviews, the FTC itself received comments from three organizations dedicated to fighting fake reviews – the Transparency Company, Fake Review Watch, and Fakespot. All three commenters claimed that “the strategies that are currently being used by review platforms are insufficient.”

All of this contributed to the FTC’s decision to take action, in an effort to finally capture a wider swath of malicious customer review practices in the US.

FTC’s Efforts to Address Rampant Fake Reviews

The newly added Part 465 has undergone an extensive development process, as is customary for any federal regulatory body.

In 2019, the FTC pursued legal action against a merchant for disseminating deceptive information and purchasing counterfeit reviews. Prior to that, the agency had also addressed the issue of “influencer marketing,” where endorsers failed to disclose their financial ties to the products they were promoting.

Now, the agency is poised to implement a comprehensive provision based on rules initially presented in November 2022.

Part 465 is the culmination of years-long research and extensive dialogue with various stakeholders, including businesses, consumers, and advertising trade organizations. 

Interestingly, despite Trustpilot not supporting the rulemaking and certain trade groups urging the FTC against rigorous enforcement on this thriving fake review business, the agency remains resolute in its pursuit.

Product Review Practices Prohibited Under the New Proposed Rule

The FTC’s proposed rule would prohibit businesses and sellers from using the following malicious review and endorsement practices.

  • Fake customer reviews, customer testimonials, or celebrity testimonials. Engaging in the fabrication, production, sale, purchase or solicitation of reviews by a reviewer who meets any of the following criteria is considered a violation.

    The reviewer does not exist. For instance, bad actors using automated buyer accounts or bots to boost positive reviews for their clients and downvote positive reviews for their rivals.

    The reviewer did not use or have any genuine experience with the product, service, or business being reviewed or endorsed. For example, using AI chatbots like ChatGPT to generate fake glowing reviews and then creating multiple dummy accounts to be able to spam a listing with those reviews.

    The reviewer significantly distorts their experience with the product, service, or business being reviewed. Additionally, the rule would also restrict people from acquiring reviews or spreading such testimonials if they were aware, or should have been aware, of their fraudulent or deceptive nature.
  • Honest negative review suppression. Review suppression means making genuine negative reviews disappear, or using threats to keep individuals from posting unfavorable feedback or to pressure them into deleting it. The FTC recently undertook its first case targeting a company’s deceptive practice of withholding negative reviews, reaching a $4.2 million settlement with Fashion Nova, LLC, a fast-fashion retailer based in California.

    Per FTC, Fashion Nova purportedly integrated a third-party review management system that enabled the company to post specific reviews automatically, while holding back others pending their approval. The fashion retailer allegedly employed this system between 2015 and 2019 to instantly publish favorable four and five-star reviews, while deliberately choosing not to display any of the numerous reviews that rated below four stars. This means Fashion Nova misrepresented the opinions of all customers who contributed feedback on its website.
  • Review hijacking. Businesses would be prohibited from using or repurposing an existing listing or product page (often one with excellent reviews) and then updating the product’s details with those of a substantially different product.

    For example, the Variation Relationship feature on Amazon organizes reviews from diverse product variations into a unified collection. Certain product offerings have multiple options, such as various shapes, sizes, colors, and more. However, Amazon only allows one consolidated set of reviews to cover all these distinct variations. 

    Imagine a selection of 4 different colors of Apple Watch Series 8, each available in aluminum finishes, along with an additional choice of 2 different wrist sizes. Despite these numerous options, customers will provide reviews for the watch as a whole, rather than separately for each variation. This ensures a comprehensive assessment of the product’s overall quality and performance, but certain sellers have found a way to exploit this particular feature.

    Their intention is to artificially inflate the number of reviews for their products. To achieve this, they resort to hijacking unrelated listings and incorporating them as product variations. As a result, the reviews from these stolen listings get merged with the genuine reviews of their own product, leading to a deceptive increase in the product’s overall rating. While these reviews might indeed be authentic and written by real individuals, they were originally intended for a completely different product altogether.
  • Biased reviews from insiders like employees. The forthcoming regulation aims to impose restrictions on corporate execs and managers, preventing them from authoring reviews or testimonials for the products or services offered by their own company, unless they explicitly disclose their affiliations. It also seeks to disallow businesses from promoting testimonials provided by internal personnel without transparently revealing any existing relationships.

    Additionally, the proposed rule will address specific instances where company officers or managers seek reviews from their employees or relatives, and whether the businesses were aware or should have been aware of such connections, to determine whether such solicitations are permissible.
  • Company-controlled review websites. An outright prohibition would be imposed on businesses like Yelp, Tripadvisor, and Trustpilot aiming to establish or exercise control over websites claiming to offer unbiased viewpoints concerning a specific category of products or services, which coincidentally include their own offerings.
  • Selling fake social media indicators. For example, businesses that sell or buy fake followers or views to gain prominence or “misrepresent their importance for a commercial purpose.”
  • Incentivized positive or negative reviews. This approach deceives shoppers looking for authentic feedback on a product or service and undercuts honest sellers. Amazon itself updated its Community Guidelines to ban this malicious practice in 2016, while making an exception for its Vine program.

Companies Subject to the Proposed Rule on Fake Reviews

The new proposed rule would encompass all businesses, defined broadly as “an individual, partnership, corporation, or any other commercial entity that sells products or services.”

However, online platforms like Amazon, Google, Yelp, Tripadvisor, and the social media sites where thousands of fake review brokers recruit reviewers may not be directly held accountable, the Washington Post reports.

That’s because Section 230 of the Communications Decency Act states:

“No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”

Therefore, these companies may opt to assert immunity under Section 230, making it difficult for the Commission to file complaints against them.

Per the FTC, “Amazon did not state support for or opposition to the rulemaking.” The retailer already has existing policies and initiatives to combat fake reviews and counterfeiters on its site.

In 2021, Amazon invested over $900 million and employed a dedicated workforce of more than 12,000 individuals in safeguarding its customers and store from fraud and abuse. Notably, in 2022, Amazon proactively put an end to more than 200 million suspected fake reviews.

Amazon also reported the existence of over 16,000 social media groups involved in buying or trading misleading reviews. Consequently, social media sites such as Facebook, Twitter, and Instagram removed groups that had amassed over 11 million members. In response to this issue, Amazon took legal action and filed a lawsuit against more than 10,000 Facebook groups in 2022.

These efforts and the FTC’s proposed rule may not be enough to completely wipe fake reviews off Amazon and other platforms, but they do hold the potential to bring some much-needed relief.

Related: A Purge Could Be Coming For Fake Reviews on Amazon, Amazon Highlights ‘Frequently Returned’ Products You Should Think Twice Before Buying, Amazon Hopes to Restore Consumer Confidence with $1.2B Anti-Counterfeit Initiative
