Facebook Purges Accounts Linked to QAnon for ‘Inauthentic Behavior’

A rallygoer holds up a cutout of the letter Q, in Lewis Center, Ohio, on Aug. 4, 2018. (Scott Olson/Getty Images)
Tom Ozimek
5/6/2020
Updated: 5/6/2020

Facebook announced that it deleted a wave of accounts, pages, and groups linked to the QAnon conspiracy theory.

The social media giant said May 5 in its monthly report on “coordinated inauthentic behavior” that the purge came in response to violations of rules put in place to curb the use of the platform for disruptive or deceptive purposes.

“We’re constantly working to find and stop coordinated campaigns that seek to manipulate public debate across our apps,” Facebook stated, adding that when it finds domestic, nongovernment campaigns that it judges to be “seeking to mislead people about who they are and what they are doing while relying on fake accounts,” it purges both inauthentic and authentic accounts, as well as pages and groups that are involved.

The move, part of the company’s April enforcement actions, covered 20 Facebook accounts, six groups, and five pages related to QAnon.

People hold up smartphones with QAnon-related messages on display, at a rally in Las Vegas, on Feb. 21, 2020. (Mario Tama/Getty Images)

While opinions vary as to its nature and intent, QAnon is a movement that started on the 4chan and 8chan message boards with a trickle of clandestine-sounding posts, often centered on the theme of big-government plots to curb individual liberties and advance so-called deep state and globalist agendas. It grew into a large underground movement with a number of splinter groups, and its adherents often claim that members of the world’s social, economic, and political elites have engaged in child sex trafficking and cannibalism.

There are many theories on who is behind Q. Among the more common beliefs is that Q stands for Q security clearance, the highest level of clearance within the Department of Energy. Others believe it refers to “Q” from the James Bond films, a figure who supports Bond as he fights a global corrupt shadow group.

Fake Engagement

Facebook said the objectionable pages, groups, and accounts frequently posted about news and topics that included the upcoming presidential election and candidates as well as the current U.S. administration.

Facebook stated: “Our investigation linked this activity to individuals associated with the QAnon network known to spread fringe conspiracy theories. We found this activity as part of our internal investigations into suspected coordinated inauthentic behavior ahead of the 2020 election in the US.”

Coordinated inauthentic behavior, according to Nathaniel Gleicher, Facebook’s head of cybersecurity policy, is defined by fake engagement, spam, and artificial amplification, rather than by the falsity of the content itself.

“Coordinated inauthentic behavior is when groups of pages or people work together to mislead others about who they are or what they’re doing,” Gleicher said.

“When we take down one of these networks, it’s because of their deceptive behavior, it’s not because of the content they’re sharing,” he said.

“The posts themselves may not be false, and may not go against our community standards,” Gleicher added.

The move is part of a broader purge that covers a total of 732 Facebook accounts, 162 Instagram accounts, 793 pages, and 200 groups, the company stated.

Targeting ‘Misinformation’

Facebook has been widely targeting articles, posts, and events during the pandemic, trying to position itself as a neutral arbiter of “misinformation.” In March, Facebook displayed warnings to users on 40 million posts, removing hundreds of thousands it deemed harmful.

The company said last week it would start notifying people who interacted with harmful claims about the pandemic, which originated in China, using as a source the World Health Organization, which has been linked to the Chinese Communist Party.

Twitter, Google, and other technology platforms have taken steps similar to Facebook’s, a trend that worries some experts.

“As a matter of public health, these moves are entirely prudent. But as a matter of free speech, the platforms’ unconstrained power to change the rules virtually overnight is deeply disconcerting,” Evelyn Douek, an affiliate at Harvard University’s Berkman Klein Center for Internet and Society, wrote in an article for The Atlantic.

“Unlike most countries’ emergency constitutions, those of major platforms have no checks or constraints. Are these emergency powers temporary? Will there be any oversight to ensure these powers are being exercised proportionately and evenhandedly? Are data being collected to assess the effectiveness of these measures or their cost to society, and will those data be available to independent researchers?”

Zachary Stieber and Joshua Philipp contributed to this report.