
A new law designed to protect dating app users from the "vile crime" of cyberflashing has come into force, placing greater responsibility on tech firms.
The government highlighted that one in three teenage girls has received unsolicited sexual images, framing the change as part of its broader commitment to combat online abuse and halve violence against women and girls.
From Thursday, social media and dating platforms are legally required to proactively detect unsolicited nude images and prevent them from reaching their users.
This shift makes cyberflashing a priority offence under the Online Safety Act, moving beyond reactive measures to demand preventative action from companies.
The Department for Science, Innovation and Technology warned that platforms failing to comply with the new legislation could face fines of up to 10 per cent of their global revenue, or have their services blocked in the UK.
Communications watchdog Ofcom will consult on new codes of practice setting out what steps platforms must take to protect users, but methods are expected to include automated systems that pre-emptively detect and hide such images, stricter content policies and moderation tools.

An Ofcom spokesperson said: “We’ll consult on updates to our codes of practice soon to reflect this change to the law, and we’ll hold platforms to account for protecting people from this despicable crime.”
Technology Secretary Liz Kendall said: “We’ve cracked down on perpetrators of this vile crime – now we’re turning up the heat on tech firms. Platforms are now required by law to detect and prevent this material.
“The internet must be a space where women and girls feel safe, respected and able to thrive.”
Safeguarding minister Jess Phillips said cyberflashing had, for too long, been “just another degrading abuse women and girls are expected to endure”.
She added: “By placing the responsibility on tech companies to block this vile content before users see it, we are preventing women and girls from being harmed in the first place.
“We will deploy the full power of the state to make this country safe for women and girls, both online and offline.”

The Bumble dating app was the first to explicitly moderate cyberflashing – through an AI-powered feature that automatically detects and blurs nudity in images sent within chats, letting users choose to view, block or report them – and it has welcomed the change in the law.
Elymae Cedeno, the app’s vice president of trust and safety, said: “Receiving unsolicited sexual images is a daily violation that disproportionately impacts women and undermines their sense of safety online.
“Strengthening the law to make cyberflashing a priority offence is an important step towards ensuring platforms proactively address this behaviour to better protect members.”
The change comes in the same week Ms Kendall called on Elon Musk’s X to urgently deal with its artificial intelligence chatbot Grok being used to create sexualised deepfake images of people, including children.
She backed the regulator Ofcom, which is looking into X and xAI, the Musk-founded firm that created Grok, to take “any enforcement action” deemed necessary.
Users of social media platform X appear to have prompted Grok to generate images of children “in minimal clothing”.
Tech tycoon Mr Musk has previously insisted that “anyone using Grok to make illegal content will suffer the same consequences as if they uploaded illegal content”.
X has said it takes action against illegal content, including child sexual abuse material, “by removing it, permanently suspending accounts, and working with local governments and law enforcement as necessary”.