Responding to Critics, Facebook ‘Oversight Board’ to Monitor Content Decisions

The social networking site Facebook is displayed on a laptop screen in London, England, on March 25, 2009. (Dan Kitwood/Getty Images)
Matthew Vadum

Facebook’s 2.4 billion monthly active users will soon be able to appeal takedown decisions to a new “oversight board,” which its CEO once likened to a “Supreme Court” and which will have the power to override the company’s own content-moderation decisions, Facebook announced Sept. 17.

“If someone disagrees with a decision we’ve made, they can appeal to us first, and soon, they will be able to further appeal this to the independent board,” Facebook CEO Mark Zuckerberg wrote in a letter, according to Ars Technica. “As an independent organization, we hope it gives people confidence that their views will be heard and that Facebook doesn’t have the ultimate power over their expression.”

The plan emerged after years of complaints that the Menlo Park, California-based company high-handedly engages in ideological viewpoint discrimination, particularly against conservatives, often without clearly explaining the reasons for its decisions.

Facebook has made takedown decisions that have been intensely criticized. For example, in 2018, it removed a post quoting the Declaration of Independence, flagging it as hate speech. The New York Times reported in December 2018 that Facebook’s content moderators used inaccurate and obsolete guidelines to decide whether to remove flagged posts.

President Donald Trump held a social media summit at the White House on July 11 to meet with conservatives upset about their views being censored on social media. Previously, he wrote on Twitter that “Google & others are suppressing voices of Conservatives and hiding information and news that is good. They are controlling what we can & cannot see. This is a very serious situation-will be addressed!”

The U.S. Federal Trade Commission imposed a record $5 billion fine on Facebook in July for misleading users about personal data-privacy policies. The FTC and several states have also initiated antitrust investigations into Facebook’s business practices.

Lawmakers from both sides of the aisle have called for breaking up or regulating Facebook, and the company has been accused both of tolerating fake news and being too quick to censor posts.

Zuckerberg came up with the idea of a content moderation tribunal in 2018, according to Columbia Journalism Review.

At that time, he called for the creation of an independent body that would make rulings on some of the decisions the company makes about what content should be permitted on Facebook pages, decisions that regularly leave the company open to criticism.

He suggested imagining “some sort of structure, almost like a Supreme Court, that is made up of independent folks who don’t work for Facebook, who ultimately make the final judgment call on what should be acceptable speech.”

This week, Facebook published the charter document that will govern what it now calls an “oversight board,” which will adjudicate content-moderation disputes.

“Freedom of expression is a fundamental human right,” the charter states.

“Facebook seeks to give people a voice so we can connect, share ideas and experiences, and understand each other. Free expression is paramount, but there are times when speech can be at odds with authenticity, safety, privacy, and dignity. Some expression can endanger other people’s ability to express themselves freely. Therefore, it must be balanced against these considerations.”

The purpose of the oversight board “is to protect free expression by making principled, independent decisions about important pieces of content and by issuing policy advisory opinions on Facebook’s content policies.” The board “will operate transparently and its reasoning will be explained clearly to the public, while respecting the privacy and confidentiality of the people who use” Facebook.

Facebook said the board will handle appeals from Facebook users and will eventually have 40 members who will serve three-year terms. Five-member panels will screen cases and decide which ones the board should consider. The board is expected to be operating by early 2020.

A trust will be created to compensate board members for their service, in order to give members independence from the company, according to Facebook. Until the board can be fully staffed, the company may temporarily delegate some of its staffers to fill positions.

All decisions rendered by the board will “be made publicly available and archived in a database of case decisions,” according to the company. A decision will be binding unless Facebook finds that carrying it out would violate the law.