The Guardian - US
Technology
Katie McQue and Mei-Ling McNamara

Meta’s new parental tools will not protect vulnerable children, experts say

Meta this week introduced new parental supervision tools. Photograph: Rachel Torres/Alamy

Social media giant Meta this week introduced new parental supervision tools, but child protection and anti-sex trafficking organizations say the new measures offer little protection to the children most vulnerable to exploitation, and divert the responsibility from the company to keep its users safe.

On Tuesday, Meta launched new features aimed at increasing parents’ awareness of their children’s activities on its platforms. For Messenger, its private message service, parents can now view and receive updates on their child’s contacts list and monitor who views any stories their child posts. On Instagram, the company has introduced a new notice to alert parents if their child has blocked somebody.

But safety features that rely on engaged families may not protect children who lack the consistent supervision of a parent or guardian, such as those in the child welfare system or living in group homes, experts warn.

“An approach to safety that puts the onus on parents and carers is not enough on its own. Many young people may not be able to speak to a parent about online concerns, particularly children in care,” said Rani Govender, senior child safety online policy officer at the National Society for the Prevention of Cruelty to Children (NSPCC), a UK-based child protection charity. “Many parents will not have the technical knowledge or time to supervise their child’s social media use.”

A 2020 report from the Human Trafficking Institute (HTI), which includes the most recent child trafficking statistics across social media, found Facebook to be the site most often used to recruit and groom child trafficking victims (65%), with Instagram and Snapchat ranking second and third. Child sex trafficking is defined as the sexual exploitation of a child as part of a commercial transaction; under US law, minors under 18 cannot consent to their own exploitation.

“Exploiters look for children online. In earlier days, they would look for them at the mall, but now they are looking for them on social media. Then they target that person and build a relationship with them,” said Lisa Goldblatt Grace, co-founder and director of My Life My Choice, a Boston-based non-profit organization supporting survivors of child sex trafficking.

In 2022, 84% of the trafficked children the organization served were in the care of the child welfare system, she says. Across the US, the National Foster Youth Institute estimates that as many as 60% of child sex trafficking victims have been in foster care or other group homes.

“When it comes to commercial sexual exploitation of children, we know that young people who do not have safe and invested parents are disproportionately at risk,” said Goldblatt Grace.

A Guardian investigation in April revealed how Meta is failing to detect or report the use of Facebook and Instagram for child trafficking, and how Messenger is being used as a platform for traffickers to communicate as they buy and sell children.

“These new tools assume they have a parent or guardian watching them on social media,” said Tina Frundt, the founder of Courtney’s House, an organization supporting minority victims of child sex trafficking in Washington, DC. “Sex traffickers have groups all over Instagram and this is how they find kids. These are kids who are the most vulnerable in society, who may have a lack of parental support, mental health issues or little self-esteem.”

In June, Meta disclosed it had set up a taskforce to investigate Instagram’s role in the distribution and sale of child sexual abuse material.

However, Meta has undergone several rounds of layoffs since November amid plans to eliminate about 21,000 jobs to cut costs. Some of these cuts fell on the company’s content moderation teams, whose employees are tasked with detecting and reporting child sexual abuse material and other graphic and abusive content on its platforms.

“Harms happening on digital platforms are continuously being framed as problems to be solved through increased user responsibility and parental intervention, rather than through meaningful systemic change,” said Lianna McDonald, executive director at the Canadian Centre for Child Protection, a charity focused on child safety.

The Canadian Centre for Child Protection and NSPCC have repeatedly called for governments to introduce regulations that address the online safety of children.

“Meta has a fundamental responsibility to look at their sites and the algorithms that they use. Child safety online can feel like an uphill battle for even the most present of parents,” said Goldblatt Grace. “Meta has a responsibility to make its social media platforms safer for kids.”

In response to the Guardian’s request for comment, Sophie Voegel, a spokesperson for Meta, said: “The exploitation of children is a horrific crime – we don’t allow it and we work aggressively to fight it on and off our platforms. We proactively aid law enforcement in arresting and prosecuting the criminals who perpetrate these grotesque offenses.” She added that Meta had removed “over 34m pieces of child exploitation and trafficking content between October and December 2022 and have also reported tens of thousands of accounts of suspected traffickers over many years to the National Center for Missing and Exploited Children, which has repeatedly recognized us as an industry leader in the fight to keep young people safe online”.

“Far from replacing them, our parental supervision tools are intended to complement our existing safeguards to help protect teens from unwanted contact,” Meta also said. “These include defaulting teens into private accounts when they sign up to Instagram, preventing people over 19 from sending private messages to teens who don’t follow them and preventing adults who have shown potentially suspicious behaviour from finding, following and interacting with teen accounts.”

• In the US, call or text the Childhelp abuse hotline on 800-422-4453. In the UK, the NSPCC offers support to children on 0800 1111, and adults concerned about a child on 0808 800 5000. The National Association for People Abused in Childhood (Napac) offers support for adult survivors on 0808 801 0331. In Australia, children, young adults, parents and teachers can contact the Kids Helpline on 1800 55 1800, or Bravehearts on 1800 272 831, and adult survivors can contact Blue Knot Foundation on 1300 657 380. Other sources of help can be found at Child Helplines International.
