- UK regulators, the Information Commissioner’s Office (ICO) and Ofcom, have issued an urgent plea to social media companies to strengthen age-checking processes for children's online safety.
- In an open letter, they accused platforms including Meta, Snap, and TikTok of failing to prioritise children's safety and demanded explanations for their age checks and grooming protections by the end of April.
- The watchdogs warned that existing age barriers are easily bypassed, leaving children vulnerable to risks such as self-harm, misogyny, and exploitation.
- Ofcom stated it would publicly report on the platforms' responses in May and, if not satisfied, is prepared to take enforcement action and potentially strengthen regulatory requirements.
- This action follows protests against Meta's "addictive" algorithms and a £14m fine issued by the ICO to Reddit for failing to protect child users.