Facebook Whistleblower Accuses Platform of ‘Fanning Ethnic Violence’ in Myanmar and Ethiopia

Former Facebook employee Frances Haugen testifies during a Senate Committee on Commerce, Science, and Transportation hearing entitled 'Protecting Kids Online: Testimony from a Facebook Whistleblower' on Capitol Hill in Washington, D.C., Oct. 5, 2021. (Jabin Botsford-Pool/Getty Images)
Isabel van Brugen
Oct. 6, 2021 | Updated: Oct. 6, 2021

Facebook whistleblower Frances Haugen during Tuesday’s Senate hearing cited ethnic violence in Myanmar and Ethiopia as examples of the “destructive impact” that the social media platform has had on society.

The former Facebook employee told lawmakers that there is a link between activity on the platform and the violence in those regions. Facebook's algorithms amplify hateful and divisive content, Haugen said, putting profit ahead of user safety.

“My fear is that without action, divisive and extremist behaviors we see today are only the beginning. What we saw in Myanmar and now in Ethiopia are the opening chapters of a story so terrifying no one wants to read the end of it,” Haugen, a former product manager for Facebook, said before the Subcommittee on Consumer Protection, Product Safety, and Data Security.

Facebook officials didn’t immediately respond to a request by The Epoch Times for comment.

In 2018, Facebook officials acknowledged that the company hadn't done enough to limit the spread of posts fueling violence against Myanmar's persecuted Rohingya minority. Earlier this year, following renewed bloodshed and a military coup in the country, the company pledged to curb the spread of misinformation.

Haugen suggested that Congress could amend Section 230 of the Communications Decency Act, which shields online platforms from liability for content posted by third parties, to prevent the viral spread of content and misinformation that she said could fuel repressive actions in such countries.

The change would make Facebook “responsible for the consequences of their intentional ranking decisions,” she said.

“I encourage reform of these platforms, not picking and choosing individual ideas, but instead making the ideas safer, less twitchy, less viral, because that is how we scalably solve these problems,” Haugen said, criticizing the platform’s engagement-based ranking as “literally fanning ethnic violence” in countries such as Myanmar and Ethiopia.

“Facebook also knows, they have admitted in public, that engagement-based ranking is dangerous without integrity and security systems, but then not rolled out those integrity and security systems to most of the languages in the world. And that’s what is causing things like ethnic violence in Ethiopia,” she added.

Responding immediately after Haugen’s testimony, Facebook sought to cast doubt on her credibility, saying she was “a former product manager at Facebook who worked for the company for less than two years, had no direct reports, never attended a decision-point meeting with C-level executives.”

“We don’t agree with her characterization of the many issues she testified about. Despite this, we agree on one thing; it’s time to begin to create standard rules for the internet,” said Lena Pietsch, Facebook’s director of policy communications.