The Street
Patricia Battle

Major dating apps are yanking their ads from Meta for a disturbing reason

Dating services Bumble and Match Group, the company behind Tinder, Match.com, Hinge and other dating apps, are pulling their ads from Meta platforms. The move comes after a bombshell report from the Wall Street Journal revealed that the companies' ads in Instagram Reels appeared next to sexually explicit content involving children.

The Wall Street Journal found these ads next to the explicit content after setting up test accounts on Instagram that followed “gymnasts, cheerleaders and other young influencers.”


The report notes that a Bumble ad appeared “between a video of someone stroking the face of a life-size latex doll and a video of a young girl with a digitally obscured face lifting up her shirt to expose her midriff.”

It also states that, along with Match, ads from companies such as Walmart and Disney appeared next to child-sexualizing content on the platform.

The report notes that Bumble and Match have decided to suspend their ads on all of Meta’s platforms after being contacted by the Journal regarding its findings.

"Once we were notified that our ads were appearing in violation of our agreement with Meta as well as our company guidelines, we took immediate action to investigate and made the decision to suspend advertising on all Meta properties across our entire portfolio," a spokesperson for Bumble told TheStreet via email.

TheStreet reached out to Match for comment and did not receive a response in time for publication.

This is not the first time Instagram has come under fire for failing to crack down on sexually explicit content involving children on its platform.


In a 2022 report from The Guardian, Instagram was accused of failing to remove accounts that were reported through its in-app reporting tool for posting images of sexualized children.

Following the release of the Journal’s report on Nov. 27, Meta, which owns Instagram, condemned content of that nature appearing on its platforms, as well as ads running near it.

“We don’t want this kind of content on our platforms and brands don’t want their ads to appear next to it,” said a spokesperson for Meta in a statement to TheStreet. “We continue to invest aggressively to stop it, and report every quarter on the prevalence of such content, which remains very low. Our systems are effective at reducing harmful content, and we’ve invested billions in safety, security and brand suitability solutions.”

