Meta Accused of Being ‘Tone Deaf’ After Reducing Minimum Age on WhatsApp

The social media corporation Meta has reduced the age limit for using WhatsApp in Britain from 16 to 13 despite concerns from campaigners.
A smartphone and a computer screen displaying the logos of Instagram, Facebook, WhatsApp, and their parent company Meta in Toulouse, southwestern France, on Jan. 12, 2023. (Lionel Bonaventure/AFP via Getty Images)
Chris Summers
4/12/2024
Updated: 4/12/2024

Social media giant Meta has been accused of being “tone deaf” after it lowered the minimum age for using WhatsApp in Britain and the European Union from 16 to 13.

It comes as campaigners are trying to persuade the government to ban the use of smartphones by those under the age of 16 because of fears of cyberbullying and other threats.

A poll commissioned by the charity Parentkind suggested 58 percent of parents would support a ban on under-16s using smartphones.

In March 2021, Mia Janin, 14, a pupil at the Jewish Free School in north London, committed suicide after enduring cyberbullying.

At an inquest in January this year, coroner Tony Murphy concluded that she “took her life while still a child and while still in the process of maturing into adulthood.”

Undated family handout photo of Mia Janin, who killed herself—after being bullied on social media—in Barnet, north London on March 12, 2021. (Family handout/PA)

The inquest heard from Rabbi Howard Cohen, a former deputy head teacher at the school, who said there was a culture of “boys-only bravado groups” sharing images of girls.

Rabbi Cohen said he was aware of a WhatsApp group in which boys rated the “attractiveness” of female pupils.

MP Says Meta ‘Highly Irresponsible’

Conservative MP Vicky Ford, who sits on the education select committee, said it was “highly irresponsible” of Meta to reduce the age recommendation without consulting parents.

Meta’s change to the WhatsApp age limit came into force on Thursday. Daisy Greenwell, co-founder of the campaign group Smartphone Free Childhood, told The Times: “WhatsApp is putting shareholder profits first and children’s safety second.”

“Reducing their age of use to 13 years old is completely tone deaf and ignores the increasingly loud alarm bells being rung by scientists, doctors, teachers, child safety experts, parents and mental health experts alike,” she added.

Ms. Greenwell said parents often think of WhatsApp as harmless, but added: “WhatsApp is far from risk-free. It’s often the first platform where children are exposed to extreme content, bullying is rife and it’s the messaging app of choice for sexual predators due to its end-to-end encryption.”

Prime Minister Rishi Sunak told the BBC the Online Safety Act would give the regulator, Ofcom, powers to ensure social media companies like Meta protect children from harmful material.

He said, “They shouldn’t be seeing it, particularly things like self-harm, and if they don’t comply with the guidelines that the regulator puts down they will be in for very significant fines, because like any parent we want our kids to be growing up safely, out playing in fields or online.”

WhatsApp said reducing the age limit would bring the UK and the EU into line with the majority of countries.

Meta Testing Nudity Filter on Instagram

Meta this week unveiled new safety features designed to protect users from “sextortion” and “cyber-flashing”.

It said it would begin testing a nudity protection filter that operates in Instagram direct messages (DMs); the filter will be the default setting for users under 18 and will automatically blur indecent images.

Ofcom’s director of online safety strategy, Mark Bunting, told BBC Radio 4’s “Today” programme that the watchdog was currently writing codes of practice for enforcing online safety, but that its powers to regulate social media would only come into effect in 2025.

He said, “When our powers come into force next year, we’ll be able to hold them to account for the effectiveness of what they’re doing.”

“If they’re not taking those steps at that point, and they can’t demonstrate to us that they’re taking alternative steps which are effective at keeping children safe, then we will be able to investigate,” added Mr. Bunting.

He said of social media services, “We’ve made recommendations that services shouldn’t prompt children to expand their network of friends, not recommend children to other users, and crucially, not allow people to send direct messages to children that they’re not already connected with.”

PA Media contributed to this report.
Chris Summers is a UK-based journalist covering a wide range of national stories, with a particular interest in crime, policing and the law.