GoodToKnow
Ellie Hutchings

Meta under fire for 'highly irresponsible' minimum age change on WhatsApp, as investigation finds children are being exposed to 'terrifying' content

Image: a close-up of a boy in school uniform using an iPhone.

Social media giant Meta has been criticised by campaign groups and politicians for lowering the minimum age of use on WhatsApp to 13.

Social media is becoming an increasing concern for parents. With worries around phone addiction and the impact on mental health, as well as a new study suggesting that online harassment of girls is 'so standard it's not noteworthy', it's no surprise that many are wondering if platforms like TikTok are safe for children and are looking for tips for keeping kids safe on social media.

It often feels like social media bosses are doing little to ease parents' concerns, and instant messaging service WhatsApp is the latest platform to come under fire after lowering its minimum age of use from 16 to 13.

Campaign groups have spoken out about the decision by Meta - the company that owns WhatsApp, as well as Facebook and Instagram - with Smartphone Free Childhood calling the move "tone deaf".

The campaign group's co-founder Daisy Greenwell said: "Reducing their age of use from 16 to 13 years old is completely tone deaf and ignores the increasingly loud alarm bells being rung by scientists, doctors, teachers, child safety experts, parents and mental health experts alike."

She added, "Among parents, WhatsApp is seen as the safest social media app, 'because it's just messaging, right?'.

"And in that way it works like a gateway drug for the rest of the social media apps. If you’re messaging your friends on WhatsApp, why not message them on Snapchat?

"WhatsApp is far from risk-free. It’s often the first platform where children are exposed to extreme content, bullying is rife and it's the messaging app of choice for sexual predators due to its end-to-end encryption."

Meanwhile, Conservative MP Vicky Ford, a member of the Education Select Committee, said Meta's decision to reduce the age recommendation without consulting parents was "highly irresponsible".

And the decision comes as parents have been left 'terrified' by malicious WhatsApp groups promoting self-harm, sexual violence and racism, according to a BBC investigation.

Schools said pupils in Years 5 and 6 were being added to the groups, and one headteacher discovered that 40 children in a single year group were involved. One parent said her 12-year-old daughter had seen sexual images, racism and swearing that "no child should be seeing".

“I immediately removed her from the group but the damage may already have been done," she said. "I felt sick to my stomach - I find it absolutely terrifying. She's only 12, and now I'm worried about her using her phone."

Thousands of parents with children at schools across Tyneside have been sent a warning about the WhatsApp groups, with a spokesperson for Northumbria Police saying, "We would encourage people to take an interest in their children’s use of social media and report any concerns to police".

Meta this week unveiled a range of new safety features designed to protect users, in particular young people, from 'sextortion' and intimate image abuse.

It confirmed it will begin testing a filter in Instagram direct messages (DMs), called Nudity Protection, which will be on by default for users aged under 18 and will automatically blur any images detected as containing nudity.

When receiving nude images, users will also see a message urging them not to feel pressure to respond, and an option to block the sender and report the chat.

Meanwhile, Prime Minister Rishi Sunak told the BBC that the Online Safety Act would give the regulator powers to ensure social media companies are protecting children from harmful material.

If your child uses WhatsApp and you're worried about their safety, child protection charity NSPCC has shared some tips for parents:

  • Get to know the privacy settings: To prevent children being added to groups by people they don’t know, you can change the group settings to 'My contacts except'. This option means only your child’s phone contacts, except those you exclude, can add your child to groups. In 'privacy', you can also change who can see your child's information.
  • Make use of safety features: Show your child how to block and report other users of the app or inappropriate content. For information on how to block or report on WhatsApp, visit the WhatsApp Help Center.
  • Talk about sharing: Talk to your child regularly about what they should and shouldn’t share with others on WhatsApp and remind them that, even if they think what they are sending will stay private, others might save, forward or screenshot it.
  • Set rules about location sharing: Decide with your child if it is appropriate for them to share their location with others and who they are allowed to share it with. You can disable location permissions by going into your device settings and switching off location services for WhatsApp.

In related news, media regulator Ofcom has said "violent content has become a normal part of children’s online lives" as it pushes for new online safety measures, and a new Facebook rollout means parents are being advised to “think carefully” about letting their children use the platform. Elsewhere, the seven social media concerns parents are most worried about have been revealed, as a new study shows half of teens admit they’re addicted to social media.
