Facebook Engineer Quits, Burned Out by ‘Political Monoculture,’ Content Policing Direction

(L) Former Facebook engineer Brian Amerige; (R) Employees have lunch at the canteen at Facebook's new headquarters in central London on Dec. 4, 2017. (Courtesy of Brian Amerige; Daniel Leal-Olivas/AFP/Getty Images)
Petr Svab
Oct. 17, 2018
Friday, Oct. 12, was Brian Amerige’s last day at Facebook after nearly seven years. “This was a difficult decision to make because I love so much about this company, our mission, and our leaders,” he said in a goodbye memo later leaked to the press.

He told his colleagues they had a lot to be proud of. He praised “the density of talent at Facebook.” He said his “teams have always felt like family.”

However, despite the positives, he saw no option but to leave.

“I left because I had disagreements with senior Facebook leadership about the content policy direction,” he told The Epoch Times via Facebook’s Messenger app.

He said the core of the content policy disagreement was the issue of “hate speech,” which he tied to the company’s entrenched leftist, or progressive, culture.

Like the other major online platforms (Google, YouTube, and Twitter), Facebook has adopted the stance that “hate speech” can’t be tolerated on its site.

But Amerige said he has seen firsthand that this policy is not only impractical, but also dangerous.

“Hate speech can’t be defined consistently and it can’t be implemented reliably, so it ends up being a series of one-off ‘pragmatic’ decisions,” he said. “I think it’s a serious strategic misstep for a company whose product’s primary value is as a tool for free expression.”

Amerige held various positions at the company, most recently as a senior engineer, and over the years he watched the focus on policing hate speech send the company’s content policies into a tailspin.

Former Facebook engineer Brian Amerige. (Courtesy of Brian Amerige)
Determining whether somebody’s words are hateful requires discerning the speaker’s intentions, “something that’s notoriously hard to grok from the outside, as we’ve each surely experienced in our communication with friends and family,” Amerige argued in an Oct. 17 Facebook post.

“I watched the teams that implement our content policy get built, and routinely spoke with the teams to give them feedback on it,” he told The Epoch Times.

Management was at first willing to listen to his concerns, but eventually he reached an impasse.

“Hate speech was the core issue, though there were definitely others as well,” he said.

Monoculture

His broader issue was with Facebook’s internal culture, which he described as “politically monocultural.”

When it came to issues such as social justice, immigration, diversity, and equality, only the progressive opinion was tolerated, and disagreement quickly attracted labels ending in “phobe” or “ist.”

“Any conversations around the hate speech policy, for example, were often met with hostility and character attacks, rather than engagement with the ideas at hand,” he said.

This culture then led to further problems.

Facebook has more than 30,000 employees. About 10,000 of them work on “safety” or “security,” a number Amerige said should double next year. Nearly half of these workers are content reviewers, he estimated.

Because the hate speech policies are “necessarily opaque,” he said, the content reviewers brought to the table “their own decidedly uninformed understanding of the policy.”

That not only led to mistakes, but also invited political bias.

Room for Bias

“There’s no intentional filtering of conservative perspectives, but many of the people in these roles aren’t aware of what non-left-leaning perspectives even are—they just aren’t exposed to them,” Amerige said.

He saw “the process run off the rails time and time again,” saying that “Facebook’s community standards are chaotically, almost randomly, enforced, with escalations and occasional reversals happening when the screw-ups are prominent enough to cause media attention.”

In August, Facebook severely penalized PragerU, a nonprofit that produces conservative educational videos, only to remove the penalties and apologize a day later, calling the sanctions a “mistake.”

Diamond and Silk, online personalities and supporters of President Donald Trump, had their Facebook account restricted and were told their content was “determined unsafe to the community.” Facebook then acknowledged that characterization was “inaccurate.”

Brandon Straka, a former supporter of President Barack Obama who launched a movement to “#WalkAway” from the Democratic Party, had his Facebook account blocked after he posted a link to an interview he gave to InfoWars. Facebook suspended InfoWars’s pages in August, as did Google, Apple, and Spotify, citing violations of hate speech and graphic violence policies, but it didn’t prohibit users from posting links to InfoWars content that didn’t itself violate Facebook’s rules. Facebook reversed the move against Straka, calling it a “mistake.”

While users affected by faulty content policing can appeal, the redress process is “approximately as inconsistent as the initial enforcement,” Amerige said.

He tried to spark change from the inside, starting an internal group, “FB’ers for Political Diversity,” which attracted about 750 members. His complaints drew some media coverage, too. But in the end, he felt burned out by constantly having to fight over the issues he saw, and so he decided to quit.


“I needed to leave to act in accordance with my values, and while I’m proud of the decision, it’s also a deeply disappointing turn of events,” he said. “I wish the company the best and I want to see Facebook work through this. I’m rooting for it.”

Facebook didn’t respond to a request for comment.