Conservatives Say Facebook Needs ‘Significant Work’ to Address Concerns: Former Senator

Small toy figures are seen in front of Facebook logo in this illustration picture, April 8, 2019. (Dado Ruvic/Illustration/Reuters)
Reuters
8/20/2019

WASHINGTON—A review by a former Republican U.S. senator concludes that political conservatives believe Facebook Inc. needs to do “significant work” to satisfy concerns about bias by the social media website, describing policies and examples that people found problematic without laying out evidence of systemic partisanship.

The report by former Sen. Jon Kyl, commissioned by Facebook and released on Aug. 20, drew on interviews with about 133 political conservatives and found that many oppose Facebook policies they believe undermine conservatives’ free speech, such as bans on “hate speech.”

It’s the latest effort by Facebook to address rising anger among Republicans over alleged conservative bias as some lawmakers call for legislation that would revoke the liability shield big tech companies have for content posted by users.

Those interviewed also pointed to anecdotal examples of what they call unfair treatment of conservative viewpoints, such as the unjustified removal of language from the Bible, which they suggest reflect broader problems with enforcement of policies.

Facebook said in response that it has hired staff dedicated to “working with right-of-center organizations and leaders.”

President Donald Trump and many Republicans in Congress accuse various social media firms of anti-conservative bias, while tech companies and Democrats have rejected the claim.

Rep. David Cicilline, a Democrat who chairs a House panel on antitrust issues, questioned the review, noting the “‘audit’ was conducted by a conservative former Republican senator who now works as a federal lobbyist.”

Sen. Josh Hawley (R-Mo.) said the report wasn’t a real audit but a “smokescreen disguised as a solution.” He said Facebook should conduct an actual audit by giving a trusted third party access to its algorithm, its key documents, and its content moderation protocols.

Facebook and other large tech firms have acknowledged mistakes in handling some specific content issues.

Nick Clegg, Facebook’s vice president of global affairs and communications, said in a blog post on Aug. 20 that the company needs “to take these concerns seriously and adjust course if our policies are, in fact, limiting expression in an unintended way.”

The Kyl report noted Facebook has made changes, including greater transparency about why people see specific posts, ensuring page managers can see enforcement actions, launching an appeals process, and creating a new content oversight board made up of people with diverse ideological views.

Republican senators have held hearings over the last two years with Facebook, Twitter Inc., and Alphabet Inc.’s Google, accusing them of bias. Last month, two Republican senators asked the Federal Trade Commission to probe how major tech companies curate content.

Democrats say the bias allegations are without merit. Sen. Mazie Hirono (D-Hawaii) said in April that “we cannot allow the Republican party to harass tech companies into weakening content moderation policies that already fail to remove hateful, dangerous, and misleading content.”

The report noted Facebook’s advertising policies prohibit “shocking and sensational content” and the company has historically rejected images of “medical tubes connected to the human body.”

That resulted in some anti-abortion advertisements being rejected. Facebook has since revised its policies to prohibit only ads depicting “someone in visible pain or distress or where blood and bruising is visible.” According to the report, this change expands the scope of advocacy available for groups seeking to use previously prohibited images.

The report by Kyl—who represented Arizona in the U.S. Senate from 1995 to 2013 and again in 2018—focused on six areas of concern: how Facebook chooses content for readers; content rules, such as those banning hate speech; potential political bias in content enforcement; ad policies, such as the prohibition of “shocking or sensational content”; enforcement of ad policies; and a belief that Facebook’s workforce lacks political diversity.

By David Shepardson