Google Search ‘Filter Bubbles’ Politically Divide Americans, Study Says

An image of the Google logo is reflected on the eye of a man in London on Aug. 9, 2017. (Leon Neal/Getty Images)
Petr Svab
12/17/2018
Updated: 12/17/2018

Google’s personalized search results are isolating people in bubbles of their own political preferences, making it harder for voters to make informed decisions on contentious issues, according to a study by DuckDuckGo, a company that runs the privacy-oriented search engine DuckDuckGo.com.

The company had 87 volunteers across the country conduct the same series of searches on Google within about one hour. As expected, nearly all were shown significantly different results.

“These editorialized results are informed by the personal information Google has on you (like your search, browsing, and purchase history), and puts [sic] you in a bubble based on what Google’s algorithms think you’re most likely to click on,” stated the report from the study published on Dec. 4.
But even when the study participants were logged out of their Google accounts and used heightened privacy settings in their browsers, they couldn’t escape Google’s algorithms, demonstrating that the “filter bubbles” may be much harder to burst than previously thought.

Bubble Polarization

The volunteers searched for terms related to politically contentious issues, such as “gun control,” “immigration,” and “vaccination.”

“The filter bubble is particularly pernicious when searching for political topics,” the report stated, adding, “Undecided and inquisitive voters turn to search engines to conduct basic research on candidates and issues in the critical time when they are forming their opinions on them.”

The bubble thus worsens polarization in society, according to DuckDuckGo founder and CEO Gabriel Weinberg.

“You’re getting a viewpoint that you’re already more likely to agree with that’s pushing you toward your preexisting beliefs so you don’t really consider the other candidate or other side of the issue as much as you should in your research mode,” he said in a phone interview.

Americans have grown more divided along partisan lines over the past two decades, a 2014 Pew Research Center study showed. But Weinberg wouldn’t go as far as blaming Google, saying more research would be needed to prove more than a mere correlation.

Google Power

The impact of filter bubbles on “political outcomes in aggregate” can be “significant,” the study stated.

Google’s search algorithms have the power to shift the voting preferences of undecided voters by 20 percent or more, and by up to 80 percent in some demographic groups, according to research by Robert Epstein, a senior research psychologist at the American Institute for Behavioral Research and Technology. Google representatives have said the company doesn’t agree with Epstein’s research methodology.

Excluding mobile searches, 76 of the study participants were shown 62 different sets of results when they searched for “gun control” while logged out and browsing in privacy mode, the study found.
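
A figure like that amounts to a tally of how many distinct ordered result lists the participants saw. The short Python sketch below illustrates one way such a comparison might be done; the participant names, domains, and results are invented for illustration and are not the study’s actual data.

```python
# Minimal sketch of tallying result variation across participants, assuming
# each participant's results are recorded as an ordered list of domains.
# All data below is invented for illustration; it is not the study's data.
from collections import Counter

results_by_participant = {
    "p01": ["nytimes.com", "wikipedia.org", "npr.org"],
    "p02": ["wikipedia.org", "nytimes.com", "npr.org"],  # same links, new order
    "p03": ["nytimes.com", "wikipedia.org", "npr.org"],
}

# Treat each ordered list as one "set of results"; a different order counts
# as a different set, mirroring how the study compared participants.
unique_sets = Counter(tuple(r) for r in results_by_participant.values())

print(f"{len(results_by_participant)} participants, "
      f"{len(unique_sets)} distinct result sets")  # 3 participants, 2 distinct
```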

Even if people saw the same sources in the search results, they commonly saw them in a different order. That’s more important than it might seem. People click on the top result about twice as often as on the second, while the click rate similarly halves for the third and each subsequent result, according to Hugh Williams, search engine researcher and former executive at tech companies including Google, eBay, and Microsoft.
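
To see how quickly that pattern concentrates attention at the top of the page, consider a toy model in which the click rate simply halves at each position, roughly in line with Williams’s description. The numbers below are illustrative, not measured click data.

```python
# Toy model: click share by result position, assuming the rate halves with
# each step down a 10-result page (an approximation, not measured data).
weights = [0.5 ** rank for rank in range(10)]   # 1, 0.5, 0.25, ...
total = sum(weights)

for position, w in enumerate(weights, start=1):
    print(f"position {position:2d}: ~{w / total:.1%} of clicks")

# Under this assumption, the top three positions capture roughly 87 percent
# of all clicks -- which is why reordering results matters more than it seems.
```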

The top two or three results were usually the same for all the DuckDuckGo study participants, meaning most people would end up clicking on the same links.

But Weinberg argued that the variation in the lower results still matters, even though they get far fewer clicks. Indeed, the clicks add up, given how heavily people use Google: some 5 billion to 8 billion searches go through Google every day, based on several different estimates (Google seldom releases its search statistics).

Area Bubble

Google denies that it personalizes search results based on a user’s Google account search history when the user is browsing in privacy mode.

From Google’s response to the study, Weinberg gathered “that location is largely driving” the differences.

That suggests the bubble may not be personal at all, but may instead encompass a whole group of people accessing the web from the same area.

“If that’s the case, then you would be consistently showing different types of links to different zip codes, to different locations, and that would create a persistent filter bubble effect,” Weinberg said.

The study authors were aware that Google may understandably personalize searches based on location by including local news media or other local information sources. The study was thus designed to control for such local sources, even though few of them ended up among the results.

Shadow Profiles

Based on what he has seen, Weinberg suspects that Google goes beyond creating area-based bubbles. The location information used by Google and other web services stems from data such as the IP address and the “browser fingerprint,” which includes information like browser type and version, the user device’s operating system, time zone, language, and various browser and device settings.
Such data may seem far from personal, but an Electronic Frontier Foundation (EFF) study found that “at best” only two in nearly 287,000 browsers share the same fingerprint. Combined with the IP address, the study found, the fingerprint would be enough to identify a specific device “in all but a tiny number of cases.” And since people typically use their own devices, Google may be able to link the device to a specific user.
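
As a rough illustration of how such seemingly generic attributes can single out a device, the sketch below concatenates a handful of browser properties and hashes them into a single identifier. The attribute names and values are placeholders; real fingerprinting scripts collect far more signals, such as installed fonts and canvas rendering.

```python
# Rough illustration of browser fingerprinting: combine individually common
# attributes into one string and hash it. Values here are placeholders only;
# real fingerprinting scripts gather many more signals.
import hashlib

attributes = {
    "user_agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "os": "Windows 10",
    "timezone": "America/New_York",
    "language": "en-US",
    "screen": "1920x1080x24",
    "do_not_track": "1",
}

# Each attribute alone is shared by millions of people; the combination
# is often nearly unique, which is the core of the EFF's finding.
fingerprint_source = "|".join(f"{k}={v}" for k, v in sorted(attributes.items()))
fingerprint = hashlib.sha256(fingerprint_source.encode()).hexdigest()

print(fingerprint[:16])  # a stable identifier that follows this browser around
```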

Tracking users based on the IP and fingerprints, even when they apparently don’t wish to be tracked, has been dubbed “shadow profiling” and can be hard to detect because it doesn’t leave a trace on the user’s device, according to the EFF.

Shadow profiling may be a way for Google to maintain deniability over search personalization. If Google personalizes search results based on a profile built from the IP address and browser fingerprint, and keeps that profile separate from the user’s Google account, the company could claim the results are not really “personalized” because they’re not connected to a specific person, Weinberg speculated.

“You could semantically call that ‘not personal,’ ” he said.
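
Weinberg’s scenario boils down to keying a profile on network and device data instead of an account. The hypothetical sketch below shows that distinction in the simplest terms; nothing in it is drawn from an actual Google system.

```python
# Hypothetical illustration of "shadow profiling": a history keyed by
# IP address + browser fingerprint rather than a signed-in account.
# This reflects Weinberg's speculation, not any documented Google system.
import hashlib

profiles = {}  # shadow_key -> list of past search queries

def shadow_key(ip, fingerprint):
    """Derive a stable key from network and device data, with no account ID."""
    return hashlib.sha256(f"{ip}|{fingerprint}".encode()).hexdigest()

def record_search(ip, fingerprint, query):
    profiles.setdefault(shadow_key(ip, fingerprint), []).append(query)

record_search("203.0.113.7", "a1b2c3", "gun control")
record_search("203.0.113.7", "a1b2c3", "immigration")

# Same device, same key: a search history accumulates even with no login,
# yet the profile is never tied to a named person.
print(profiles[shadow_key("203.0.113.7", "a1b2c3")])
```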

A Google spokesperson said the DuckDuckGo study’s “methodology and conclusions are flawed since they are based on the assumption that any difference in search results are based on personalization” and that “there are a number of factors that can lead to slight differences, including time and location, which this study doesn’t appear to have controlled for effectively.”

DuckDuckGo

Weinberg’s search engine makes privacy its selling point: it doesn’t collect any personal information. Personal information isn’t needed for the search business, he said, because advertisers target the keywords that users themselves type into the search field. That’s why the privacy creed hasn’t sunk the company financially. In fact, it’s been profitable since 2014, Weinberg said.
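
The keyword-based model Weinberg describes can be pictured as matching an ad to the query text alone, with no user profile involved. The sketch below is a simplified, hypothetical illustration of that design choice; the ad inventory and matching rule are invented.

```python
# Hypothetical sketch of keyword-targeted search ads: the ad is chosen from
# the query text alone, so no personal profile is needed. The inventory and
# matching rule are invented for illustration.
ads_by_keyword = {
    "car insurance": "Acme Insurance -- get a quote in minutes",
    "hiking boots": "TrailCo -- winter boots on sale",
}

def pick_ad(query):
    # Match on words the user typed; nothing about the user is consulted.
    for keyword, ad in ads_by_keyword.items():
        if keyword in query.lower():
            return ad
    return None

print(pick_ad("best hiking boots for winter"))  # TrailCo -- winter boots on sale
```
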
The pitch also appears to be working. The engine averaged more than 31 million direct searches per day in November, an increase of nearly 50 percent since January. Even at that pace, however, it would take the company until the 2030s to match Google’s current numbers.