Ofcom has warned that online sites, including popular social media platforms, will face hefty fines and enforcement action if they fail to reform the algorithms that recommend harmful content to children.
In practice, that means platforms like Facebook, Instagram and Snapchat will have to implement “highly effective age-checks.”
In some cases, this will mean preventing children from accessing the online service altogether.
“ … platforms must introduce the kinds of age-checks young people experience in the real world and address algorithms which too readily mean they come across harmful material online,” said Technology Secretary Michelle Donelan.
Online platforms can verify a user’s age by accessing, with the user’s consent, the information their bank holds on record. Photo ID matching, facial age estimation and credit card checks are also among the age assurance methods Ofcom considers highly effective.
The regulator also warned content providers that they should filter the most harmful content out of children’s feeds.
This can be done by configuring the algorithms that serve personalised recommendations to users.
Ofcom Chief Executive Dame Melanie Dawes said that the guidance goes “beyond current industry standards” and that the watchdog won’t hesitate to use the “full range of enforcement powers” to hold platforms accountable.
Enforcement
The draft Children’s Safety Codes of Practice come under the Online Safety Act, passed in Oct. 2023. The act created a duty of care requiring online services to safeguard users from illegal, or legal but “harmful,” content.
The chief executive of the NSPCC, Sir Peter Wanless, said that tech companies will be legally required to ensure the safety of their platforms for underage users.
Once approved by Parliament, the Codes will come into effect and Ofcom can begin enforcing the regime.
“To platforms, my message is engage with us and prepare. Do not wait for enforcement and hefty fines—step up to meet your responsibilities and act now,” said Dame Melanie.
Ofcom highlighted evidence of how children are influenced by harmful content online, suggesting that current moderation efforts are not enough.
Bereaved Families
Ian Russell, whose daughter Molly took her own life aged 14 after viewing disturbing content on social media, said that more needs to be done to protect children.
Speaking on BBC “Breakfast” on Wednesday, Mr. Russell said that tech companies have been slow to move on stricter controls.
Another parent, whose 13-year-old son died after taking part in a dangerous social media challenge, argued that platforms like TikTok should not feed such content to under-18s.
Esther Ghey, mother of 16-year-old Brianna Ghey, who was murdered in 2023 in a premeditated attack by two teenagers, has been campaigning to ban social media apps for all children under 16.
One of her daughter’s killers had accessed a dark web app to watch torture and snuff videos in “red rooms.” Ms. Ghey told the BBC that she believes social media algorithms are “brainwashing” young people.
Ofcom plans to publish the final Children’s Safety Codes of Practice within a year.
“Services will then have three months to conduct their children’s risk assessments, taking account of our guidance, which we have published in draft today,” the regulator said.