The Social Graph Won’t Save Us From What’s Wrong With Online Reviews

It may very well make shills of us all.
Sure you’re connected to them, but can you trust them? (Michael Sean Gallagher, CC BY-SA)
Published 5/4/2015. Updated 5/26/2015.

In early 2015, the Belfast Telegraph sent reporter Kim Kelly undercover to visit Northern Ireland’s “worst” hotel, according to its online reputation. Kelly reported that although some TripAdvisor reviews had called it a “hell hole” and a “dustbin,” she was pleasantly surprised by its “clean and compact” rooms.

This story is indicative of how important online reviews have become and the skepticism many have toward them. In a 2014 survey of Americans by the market research firm YouGov, 90 percent of respondents said that checking online reviews was an important part of shopping. An equal percentage believed that such reviews are sometimes manipulated—for motives not difficult to discern.

Online Reviews Translate to Big Bucks

As I document in my new book “Reading the Comments: Likers, Haters, and Manipulators at the Bottom of the Web,” online reviews affect merchants’ bottom lines. Multiple studies have shown that good reviews allow merchants to charge higher prices and that they boost restaurant bookings and the sales of books, hotel rooms, and video games. Accordingly, review platforms are worth millions. TripAdvisor’s 2011 IPO valued the company at over US$3 billion. In 2013, Amazon purchased Goodreads, the book review and discussion site, for $150 million. Google went on its own review acquisition spree, buying Zagat in 2011 and Frommer’s Travel in 2012.

Beyond consumers, merchants, and review platforms, there is another actor keen to benefit from online reviews: the illicit manipulator. From overseas “sweatshops” that earn pennies per post to “boutique” reputation services for the rich, there is a vast market for online deceit. Studies that look for patterns in posts (such as the ratio of positive to negative words) and in activity (such as a negative review quickly followed by a positive one) estimate that 10 to 30 percent of reviews are fake. Similarly, Yelp discloses that about a quarter of its reviews are “filtered” as unreliable: they are not easily seen and are excluded from the average star ratings the service displays.
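To give a flavor of the signals such studies rely on, here is a minimal, hypothetical sketch in Python. The word lists, thresholds, and field names are placeholders of my own invention, not taken from any particular study or from Yelp’s filter.

    # Toy illustration of the kinds of signals fake-review studies examine.
    # Word lists and thresholds are arbitrary placeholders, not a real detector.

    POSITIVE = {"great", "amazing", "wonderful", "clean", "friendly", "perfect"}
    NEGATIVE = {"dirty", "awful", "terrible", "rude", "broken", "hell"}


    def sentiment_ratio(text):
        """Ratio of positive to negative words (add 1 to each to avoid dividing by zero)."""
        words = [w.strip(".,!?") for w in text.lower().split()]
        pos = sum(w in POSITIVE for w in words)
        neg = sum(w in NEGATIVE for w in words)
        return (pos + 1) / (neg + 1)


    def looks_suspicious(review, previous_review=None):
        """Flag a review whose language is extremely one-sided, or a glowing
        review posted within a day of a negative one for the same business."""
        extreme_language = sentiment_ratio(review["text"]) > 5
        quick_rebuttal = (
            previous_review is not None
            and previous_review["stars"] <= 2
            and review["stars"] >= 4
            and review["timestamp"] - previous_review["timestamp"] < 24 * 3600
        )
        return extreme_language or quick_rebuttal


    if __name__ == "__main__":
        bad = {"text": "Dirty rooms and rude staff.", "stars": 1, "timestamp": 0}
        glowing = {"text": "Amazing, wonderful, perfect stay!", "stars": 5, "timestamp": 3600}
        print(looks_suspicious(glowing, previous_review=bad))  # True

Actual detection systems combine many more features, from reviewer history to network structure, and train classifiers on labeled examples; this sketch only illustrates the two cues mentioned above.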

Like it? Loyal? What's in it for me? (Search Influence, CC BY-NC-SA)

Social Steps In

What to do? Some suggest the “social graph” as a solution: favoring the comments and activities of users’ friends over the recommendations of strangers. Instead of reading an anonymous review of an eatery, you are informed that your friend Alice enjoyed her sandwich there.

But this solution assumes that the platforms themselves (and your buddy Alice) can be trusted. Review sites, including Yelp, have been accused of extorting merchants by rigging which reviews are seen depending on whether the merchant paid for advertising, and even of having employees post bad reviews after a merchant declined to advertise. So far, Yelp has prevailed in the courts, and review platforms will continue to profit by manipulating the visibility of users’ praise and pillory.

I don't like it. (Sean MacEntee, CC BY)

Facebook was accused of abusing both its advertisers and its end users with its Sponsored Stories program, which it described as “messages coming from friends about them engaging with a Page, app, or event that a business, organization, or individual has paid to highlight so there’s a better chance people see them.” Businesses with Page accounts were upset to find it harder to reach their fans: on average, only 15 percent saw these Sponsored Story messages. If clients wished to reach more fans, they needed to pay for more Sponsored Stories; many complaints soon followed.

Worse yet, end users were surprised to find themselves appearing in Facebook ads. A plaintiff in a lawsuit against Facebook appeared in ads because she had “liked” an online French language course in hopes of getting a discount. Facebook is not alone. Google+ has “shared endorsements” and, in addition to its “promoted tweets,” Twitter will reportedly start filtering and shaping its users’ timelines later in 2015.

Geek & Poke by Oliver Widder. (geek-and-poke.com/CC BY)

Additionally, moving to the social graph is only likely to implicate our acquaintances in the same game; users, too, are tempted to exploit it. When our friend Alice posted that she enjoyed the sandwich, perhaps it was because she also got a free drink for doing so? Shoppers click “like” in hopes of a discount and recommend products to acquaintances so as to earn a referral fee. Much of this is driven by the extraordinary value of reviews today, the rapacious desire to rate and rank everything, the consequent dynamic of competition, and the sense that everyone else is already doing it.

The social graph will not save us; it may very well make shills of us all.

Joseph Reagle is assistant professor of digital communications at Northeastern University. This article was originally published on TheConversation.com.