Facebook to Add Warnings, Limit Visibility of Groups That Violate Platform Rules

Facebook announced Wednesday that it has implemented a number of changes that will limit the visibility of groups on the social media platform that violate its standards.
A 3D-printed Facebook logo is seen placed on a keyboard in this illustration taken March 25, 2020. (Dado Ruvic/Illustration)
Isabel van Brugen
March 18, 2021 | Updated: March 18, 2021
Facebook announced on March 17 that it has implemented a number of changes on the social media platform that will reduce the reach of, and in some cases ban, groups that violate its policies.

The tech giant said in a statement that it has taken action to curb the spread of “harmful content, like hate speech and misinformation,” and is making it harder for certain groups to operate or be discovered.

“When a group repeatedly breaks our rules, we take it down entirely,” the announcement from Facebook’s vice president of engineering, Tom Alison, stated.

As part of the changes, Facebook will notify users who want to join a group if the group has previously approved posts that violate the platform’s “community standards.” Users can then opt to review the group before joining.

The social media platform stated it will also remove political and civic groups from all group recommendations.

“While people can still invite friends to these groups or search for them, we have now started to expand these restrictions globally. This builds on restrictions we’ve made to recommendations, like removing health groups from these surfaces, as well as groups that repeatedly share misinformation,” the company stated.

When a group begins to violate the platform’s community standards, Facebook will rank it lower in its recommendations, making it less likely that people will discover it, Alison said.

Group administrators and moderators will also be required by Facebook to “temporarily approve all posts when that group has a substantial number of members who have violated our policies or were part of other groups that were removed for breaking our rules.”

Further, users who repeatedly violate platform rules in groups will be temporarily barred from posting in any group and will be blocked from creating new groups or inviting others to groups.

The changes come a week before Facebook CEO Mark Zuckerberg is set to testify at a hearing on misinformation in front of the House Energy and Commerce Committee.

Facebook founder and CEO Mark Zuckerberg in Washington on April 10, 2018. (Samira Bouaou/The Epoch Times)

Facebook has come under increasing scrutiny over censorship of its users’ speech. The company has repeatedly claimed its platform doesn’t favor one political viewpoint over another.

Facebook whistleblower Ryan Hartwig, who lifted the lid on the social media giant’s pattern of censoring conservatives, told The Epoch Times’ “American Thought Leaders” program on Feb. 28 that censorship at the company has become “outrageous.”

Hartwig, a former content moderator at a third-party company that provided services to Facebook, made news in June 2020 when he alleged that moderators were told to enforce the social media platform’s policies selectively to allow, under certain circumstances, content that demonized the police or white males.

“I saw that Facebook gave exceptions to, essentially, silence conservatives,” he said in an interview during the Conservative Political Action Conference (CPAC) in Orlando, Florida.

Hartwig, who worked for nearly two years for Cognizant, a firm hired by Facebook to handle part of its manual content policing, said in the interview that Facebook on a number of occasions provided guidance that was biased in favor of left-leaning causes and perspectives.

Rep. Devin Nunes (R-Calif.) said at the CPAC event last month that censorship by Big Tech is “probably the biggest problem that we face right now in this country, as a party.”

Tom Ozimek contributed to this report.