Johnson Vows to Fine Social Media Companies Failing to Remove ‘Hate and Racism’

Prime Minister Boris Johnson speaks during Prime Minister's Questions in the House of Commons, in London on July 14, 2021. (House of Commons via PA)
Lily Zhou
7/14/2021 | Updated: 7/14/2021

Social media companies have been told they will have to remove “hateful” or “racist” content from their platforms or face hefty fines, the UK’s Prime Minister Boris Johnson said on Wednesday.

The prime minister met with representatives from social media companies on Tuesday after online racial abuse towards black England football players sparked outrage.

“Last night I met representatives of Facebook, of Twitter, of TikTok, of Snapchat, of Instagram and I made it absolutely clear to them that we will legislate to address this problem in the Online Harms Bill, and unless they get hate and racism off their platforms, they will face fines amounting to 10 percent of their global revenues,” Johnson said.

He also vowed to change the football banning regime to cover online abuse. Banning orders bar people from attending football matches if they are convicted of a “relevant offence” linked to a match.

Greater Manchester Police on Wednesday arrested a man on suspicion of an offence under Section 1 of the Malicious Communications Act over racial abuse. The man remains in custody for questioning.

Online Safety Bill

The UK government has already drafted legislation that would give it powers to fine social media companies. In May it published the draft (pdf) of an ambitious Online Safety Bill, which was promoted as a “world-leading approach” to regulating the online space.

The bill introduces a “duty of care,” requiring social media sites, websites, apps, and other services hosting user-generated content or allowing people to talk to others online to “remove and limit the spread of illegal and harmful content such as child sexual abuse, terrorist material, and suicide content.”

According to the government statement, the bill also seeks to tackle “racist abuse,” scams, and “disinformation.”

The UK’s communications regulator Ofcom will have the power to fine companies up to 18 million pounds ($25 million) or 10 percent of annual global turnover, whichever is higher, and to block access to sites if the companies fail their “duty of care.”

If companies still “don’t step up their efforts to improve safety,” a new criminal offence may be introduced for senior managers.

The government said the bill also aims to uphold democratic debate online by requiring companies to protect content defined as “democratically important,” safeguarding users’ access to journalistic content shared on their platforms and forbidding them from discriminating against particular political viewpoints.

But the bill has been lambasted by the civil liberties group Index on Censorship, which said it is “catastrophic for freedom of speech.”