Can Amazon Fix the Problem of Fake Online Reviews?
If an online review system can be perfected, Amazon vows to get there first.
The e-commerce company is rolling out a series of changes to how different reviews are prioritized on its website, giving more prominence to newer reviews, those written by verified purchasers, and ones that are up-voted the most by other customers.
Before the change, an item’s rating in the five-star system was a simple average of all its reviews. Now, high-quality reviews will influence the average rating more than low-quality ones.
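Amazon has not published the formula behind its weighted ratings, but the general idea can be sketched in a few lines. In this illustrative Python snippet, the weights for recency, verified purchases, and helpfulness votes are assumptions chosen for the example, not Amazon’s actual criteria:

```python
# A minimal sketch of a weighted star rating. The specific weights
# (verified-purchase boost, vote cap, age decay) are illustrative
# assumptions; Amazon's real formula is proprietary.

def weighted_rating(reviews):
    """Each review is a dict with 'stars' (1-5), 'verified' (bool),
    'helpful_votes' (int), and 'age_days' (int)."""
    total_weight = 0.0
    total_score = 0.0
    for r in reviews:
        weight = 1.0
        if r["verified"]:
            weight *= 2.0  # assumed boost for verified purchasers
        # Up-votes add weight, capped so a vote pile-on can't dominate.
        weight *= 1 + min(r["helpful_votes"], 50) / 50
        # Newer reviews count more: weight halves every year.
        weight *= 0.5 ** (r["age_days"] / 365)
        total_weight += weight
        total_score += weight * r["stars"]
    return total_score / total_weight if total_weight else 0.0

reviews = [
    {"stars": 5, "verified": True,  "helpful_votes": 30, "age_days": 10},
    {"stars": 1, "verified": False, "helpful_votes": 0,  "age_days": 700},
]
print(round(weighted_rating(reviews), 2))
```

Under this scheme the recent, verified, up-voted five-star review pulls the item’s rating well above the simple average of 3.0 that the old system would have reported.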
Thousands of new reviews are added to Amazon.com every day, and over the years it has accumulated more than 35 million reviews on its website, far too many to be individually vetted by human eyes. This has, predictably, made the review section vulnerable to a host of different types of abuse.
Authors have been caught time and time again penning, under pseudonyms, laudatory reviews of their own works, and trashing the output of their rivals. Companies have been caught having their own employees write 5-star reviews of their products, and Amazon sued Web mercenaries for selling fake Amazon reviews.
Amazon isn’t the only website vexed by abuses of its review system. Business owners on Yelp, a website that revolves around reviews, have had their stores pummeled by libelous remarks from rivals, and in 2013 a pair of Harvard researchers estimated that 16 percent of reviews on Yelp were fraudulent.
If successfully implemented, Amazon’s reforms would lessen the severity of several of the abuses outlined above, or at least raise the cost of posting effective fake reviews.
One-sided reviews by users with a clear agenda against a book, an author, or even a coffee machine are usually easy to spot and down-vote, and a system of weighted reviews would neutralize the ability of those reviews to drag down the average rating of an item.
Giving more weight to reviews from verified purchasers also provides a line of defense against companies running astroturf marketing campaigns in the review section, and against fake reviewers for hire, who would have to go through the added step of buying the product themselves, a product often far pricier than the fake review itself.
The Limits of Algorithms
However innovative Amazon’s current and future adjustments to the review section may be, the inherent limits of machine intelligence mean that there will always be loopholes, sometimes gaping ones, in an automated system.
Companies could make deeper investments in fake reviews, allowing their authors to pay for the products on an expense account. If Amazon writes an algorithm to detect strange shopping behavior, such as buying an entire line of similar coffee makers from a single company, the fake reviewer could be tasked with purchasing unrelated items, like office supplies, to evade such criteria.
Despite all the promises of artificial intelligence, even the most cutting-edge machine minds are still highly primitive at emulating the common sense that most humans take for granted. Watson, the IBM computer that managed to beat former Jeopardy! champions in an episode of the quiz show in 2011, still gave remarkably buffoonish answers on the path to victory: when asked what grasshoppers eat, it said “Kosher.”
For now, readers will still have to judge the authenticity of online reviews at their own discretion. Despite the proliferation of fake entries, reviews are slowly becoming an indispensable tool for the American consumer. A 2014 study by marketing agency BrightLocal found that 88 percent of Americans trusted, “under the right circumstances,” online reviews as much as a personal recommendation from a friend.