The Independent UK
Nicole Wootton-Cane

Apps that can ‘strip’ victims still available on Apple and Google stores

Apps that allow users to create AI-generated “nude” photos of real people are still available in Apple and Google app stores.

The creation of sexually explicit deepfakes was made illegal in the UK following outrage over the use of Elon Musk’s Grok to generate sexualised images of women and children.

But The Independent found that several apps which can be used to “strip” photos remain downloadable from the country’s two biggest app stores.

It comes after a Tech Transparency Project (TTP) investigation found 55 apps in the US version of the Google Play Store that can digitally remove clothing from images of women, depicting them fully or partially naked or in minimal clothing. The investigation found a further 47 such apps in the US Apple App Store.

A search by The Independent showed that several similar apps, as well as apps named in the TTP investigation, are also available in the UK versions of both app stores.

The Google Play Store policy on inappropriate content states: “We don’t allow apps that contain or promote sexual content or profanity, including pornography, or any content or services intended to be sexually gratifying.

“We don’t allow apps or app content that appear to promote or solicit a sexual act in exchange for compensation. We don’t allow apps that contain or promote content associated with sexually predatory behaviour, or distribute non-consensual sexual content.”

Apple App Store policy states apps “should not include content that is offensive, insensitive, upsetting, intended to disgust, in exceptionally poor taste, or just plain creepy”.

The policy also forbids “overtly sexual or pornographic material”, defined as “explicit descriptions or displays of sexual organs or activities intended to stimulate erotic rather than aesthetic or emotional feelings”.

But both platforms hosted apps that allowed pictures of women to be digitally stripped.

One app highlighted in the TTP investigation was able to generate, from an uploaded photo, a video of a woman taking her top off and dancing. The app remained available in both stores as of Friday afternoon and has been downloaded more than five million times.

Another app available on the Google Play Store advertises the ability to “try on” clothing and shows images of women placed in bikinis.

Apple said it had removed 28 apps that the TTP identified in its report and contacted the developers of others to give them a chance to rectify guideline violations. Google also appears to have removed some apps.

Following the Grok controversy, women’s rights campaigners, including Refuge, Women’s Aid and Womankind Worldwide, said the “disturbing” rise in AI intimate image abuse has “dangerous” consequences for women and girls, including to their safety and mental health.

Emma Pickering, head of technology-facilitated abuse and economic empowerment at charity Refuge, said: “As technology evolves, women and girls’ safety depends on tighter regulation around image-based abuse, whether real or deepfake, as well as specialist training for prosecutors and police.

“Women have the right to use technology without fear of abuse, and when that right is violated, survivors must be able to access swift justice and robust protections.”

The Independent has contacted Google and the government for comment.
