Facebook on May 6 announced the first members of its oversight board, which it says will be an independent body that can overturn the company’s own content-moderation decisions.
In a press release, the social media and technology company, which has more than 2 billion users, said the members “reflect a wide range of views and experiences,” have lived in more than 27 countries, “speak at least 29 languages, and are all committed to the mission of the Oversight Board.”
“We expect them to make some decisions that we, at Facebook, will not always agree with—but that’s the point: they are truly autonomous in their exercise of independent judgment. We also expect that the board’s membership itself will face criticism. But its long-term success depends on it having members who bring different perspectives and expertise to bear,” the company said.
The members include lawyers, current and former journalists, rights advocates, and academics, with experience in areas such as internet censorship, platform transparency, digital rights, content moderation, press and religious freedom, and online safety.
The board will review some of the company’s most complex calls on whether to take down potentially harmful posts on Facebook and Instagram, including graphic content, so-called “hate speech,” violence, and often-polarizing posts. According to USA Today, the board will receive cases through a content management system linked to Facebook’s own platforms. Members will then discuss each case as a group and will have 90 days to issue a final decision on whether the content should stay up or come down. For more urgent cases referred by Facebook, an expedited 30-day review will be conducted.
The four members who hold the title of co-chair, and who were selected by Facebook directly, are Helle Thorning-Schmidt, former prime minister of Denmark; U.S. law professors Jamal Greene and Michael McConnell; and Catalina Botero Marino, dean of the Universidad de Los Andes Faculty of Law, who has served as special rapporteur for freedom of expression at the Organization of American States.
The panel also includes Alan Rusbridger, former editor-in-chief of The Guardian; Tawakkol Karman, a Nobel Peace Prize laureate who promoted nonviolent change in Yemen during the Arab Spring; and John Samples, vice president and founder of the Center for Representative Government at the Cato Institute, who has written extensively about social media and speech regulation.
Others include law professor and former senior U.S. State Department lawyer Evelyn Aswad; Stanford law professor and U.S. Supreme Court advocate Pamela Karlan; Queensland University of Technology Law School professor Nicolas Suzor; Brazilian academic, technology and policy issues lawyer Ronaldo Lemos; director of Human Rights Watch’s Global Alliances and Partnerships program Maina Kiai; and former judge and vice president of the European Court of Human Rights András Sajó.
Rounding out the board are Endy Bayuni, former editor-in-chief of the Jakarta Post; Julie Owono, a digital rights and anti-censorship advocate; Sudhir Krishnaswamy, vice chancellor of the National Law School of India University; Katherine Chen, a former national communications regulator in Taiwan; Nighat Dad, a digital rights advocate and recipient of the Human Rights Tulip Award; Afia Asantewaa Asare-Kyei, a human rights advocate with dual Ghanaian and South African citizenship; and Emi Palmor, former director general of the Israeli Ministry of Justice.
The board’s members, whom Facebook said are not company employees and cannot be removed by Facebook, will begin hearing cases in the coming months. The board will eventually have around 40 members, at which point it alone will be responsible for selecting new members, the technology company said.
Facebook acknowledged that the board “won’t be able to hear every case we or the public might want it to hear.”
“We know the board will play an increasingly important role in setting precedent and direction for content policy at Facebook. And in the long term, we hope its impact extends well beyond Facebook, and serves as a springboard for similar approaches to content governance in the online sphere,” the company said.
Facebook first announced its plans to launch an oversight board in November 2018, with CEO Mark Zuckerberg saying the move was an attempt to “create a new way for people to appeal content decisions to an independent body, whose decisions would be transparent and binding,” as “Facebook should not make so many important decisions about free expression and safety on our own.”
“I believe independence is important for a few reasons,” Zuckerberg said in a note posted to Facebook. “First, it will prevent the concentration of too much decision-making within our teams. Second, it will create accountability and oversight. Third, it will provide assurance that these decisions are made in the best interests of our community and not for commercial reasons.”