Facebook is working toward reducing the amount of political content on the platform following feedback from its users, company CEO Mark Zuckerberg said on Wednesday.
“One of the top pieces of feedback we’re hearing from our community right now is that people don’t want politics and fighting to take over their experience on our services,” Zuckerberg wrote in a post.
“We’re currently considering steps we could take to reduce the amount of political content in News Feed as well. We’re still working through exactly the best ways to do this.”
He said that the platform would still allow people to join political groups, such as grassroots movements, and to take part in discussions, for example, to speak out against injustice. But Facebook will stop recommending civic and political groups for the long term, a policy it expects to expand globally.
Facebook was one of the Silicon Valley companies that began engaging in more targeted moderation of users' content following criticism that it allowed misinformation to spread on its platform. In the past few months, it has deployed a number of policies aimed at stopping misinformation and preventing violence by its users.
In the lead-up to and around the election, Facebook increased its policing of posts from former President Donald Trump that raised concerns about election integrity and alleged voter fraud.
Following the Jan. 6 U.S. Capitol breach, the company ramped up its policy enforcement, removing groups and speech that it claimed could incite violence and hate. It followed Twitter in locking Trump out of his account.
In the lead-up to Inauguration Day, the company said that it was removing all content containing the phrase "stop the steal." The phrase was used by supporters of Trump who were calling on Congress and state legislators to investigate the integrity of the 2020 general election. Multiple rallies under the "Stop the Steal" slogan were held across the country following the Nov. 3 election.
Facebook has also banned ads for weapons, ammunition, and firearm enhancements such as silencers, and later announced that it was temporarily banning ads that promote firearm accessories and protective gear in the United States.
The company said the moves were necessary to prevent users from using its platform to engage in violence.
Facebook's enforcement of its policies has been repeatedly criticized as unbalanced, with critics saying that much of the policing targets conservatives and perceived Trump supporters. An undercover investigation released by Project Veritas in June 2020 suggested that at least one of Facebook's algorithms appeared designed to flag predominantly right-leaning content, according to a former moderator.
The investigation also recorded a Facebook moderator saying that the company applies different standards between left-leaning and right-leaning content.
Facebook did not respond to The Epoch Times’ request for comment on Project Veritas’s report.
Shortly afterward, in August, Facebook announced that it would take enforcement action against QAnon supporters, Antifa extremists, and U.S.-based militia organizations. In October, after Trump was diagnosed with the CCP (Chinese Communist Party) virus, Facebook said it would remove posts that hoped for Trump's death.
Perceived unbalanced moderation of users' content by social media companies has raised concerns over First Amendment rights and the lack of checks and balances on decisions made by Big Tech companies. Efforts to limit or eliminate liability protections under Section 230 of the Communications Decency Act for companies that engage in censorship or political conduct have been heavily discussed over the past year.