The Independent UK
Olivia Petter

How being ‘Chatfished’ has become the latest dating red flag

There was a time when the worst thing that could happen to you on a dating app was discovering you’d been messaging someone pretending to be another person. Now it’s discovering that you’d been messaging a robot. Or, more specifically, an AI chatbot. It might sound like the subject of a Black Mirror episode, but this strange dating dystopia has sadly become our reality, with an increasing number of reports indicating that more people are using ChatGPT to help them with their love lives.

According to research by the online dating company Match, almost half of Gen Z Americans have used AI platforms like ChatGPT for dating advice. Meanwhile, more than half of dating app users have used AI to generate messages to send to dates, according to a 2024 report by the dating app Flirtini, and a 2024 report by McAfee found that one-third have used it to improve their dating profiles.

And men are more likely than women to use AI to enhance their love lives, with one in three men aged 18-34 in the US using ChatGPT for dating advice compared to 14 per cent of women in the same age range, according to a 2024 survey by Pollfish. The platform has become a crutch for everything, with people using it to lure in new matches on dating profiles, sustain conversations with multiple dates at the same time, and even craft difficult conversations and breakup messages or resolve disagreements.

Joaquin Phoenix in the film Her, which imagined a man falling in love with an AI bot (Warner Bros)

A lot of this may seem helpful, and even fun. “People use ChatGPT for dating to create more engaging profiles, craft thoughtful messages, and practice confident communication,” says dating psychologist Dr Madeleine Mason Roantree. “It also offers emotional support and helps people analyse confusing dating situations. Some enjoy it for creativity, using it to generate date ideas or playful banter.”

So, yes, on the surface it’s harmless. Until it isn’t.

The use of ChatGPT has now become so prolific that talk has turned to being “chatfished”, aka realising that the conversations you’ve been having with someone online have mostly been fuelled by whatever they’ve taken from ChatGPT. Dr Roantree continues: “I think there is a risk of inauthenticity and that the dating efforts will obviously backfire if it seems that the date is interested in you when you’re using ChatGPT compared to in person, when you are not.”

And as well as using ChatGPT to pretend you are someone you are not, all of this becomes even darker when you consider the rising number of people who are actually forming relationships with AI directly. Last month, a study by Vantage Point Counselling Services found that almost one-third of Americans have had an “intimate or romantic relationship” with an AI chatbot, something that has kick-started another conversation entirely about whether you can cheat on a partner with AI.

Chatbots mirror users’ emotions – joy, sadness, anger, and more – which cultivates emotional bonds akin to those of human relationships (Getty/iStock)

Researchers at the USC Information Sciences Institute analysed over 30,000 user-shared conversations with social chatbots to examine the emotional dynamics of human–AI relationships. Their study, published in June, identified patterns of emotional mirroring, noting that chatbots mirrored users’ emotions – joy, sadness, anger, and more – which cultivated emotional bonds akin to those of human relationships. This, the researchers said, created a “compelling illusion of intimacy”. And while these interactions may satisfy fundamental needs for human social connection, they also “risk displacing genuine human relationships, hindering emotional development, and introducing other unforeseen harms”.

There was a further concern that emotionally responsive chatbots were thoughtlessly reflecting and rewarding toxic behaviour, especially among vulnerable users with poor emotional regulation. This, the researchers said, risked ingraining these patterns into human-to-human relationships.

The dating landscape feels near-impossible right now, with raging heteropessimism making us all miserable (Getty/iStock)

All of this is happening against a backdrop where it seems we’re having less sex in real life than ever before, with the Institute for Family Studies declaring that the US has entered a “Sex Recession” because the number of American adults having sex weekly has fallen consistently over the past 15 years. Meanwhile, the UK appears to be in the midst of a “relationship recession”, with The Economist reporting that the world has 100 million more single people today than it would if coupling rates were as high as they were in 2017.

The recession could deepen quite quickly, too, with OpenAI, ChatGPT’s developer, announcing that it plans to allow for a wider range of content on the platform, including erotica, as part of a move to “treat adult users like adults”, according to the company’s boss, Sam Altman. Meanwhile, Bumble has said it’s looking into AI-powered matchmaking while Tinder is already developing a similar feature to combat so-called “swipe fatigue” and improve its algorithm.

As the technology develops, its impact is only likely to increase. After all, the dating landscape feels near-impossible right now, with raging heteropessimism making us all miserable and creating a self-perpetuating cycle of negativity. No wonder, then, that if ChatGPT can give us the positive push we need to make meaningful connections, more people are embracing it. But where this takes us, when men are more likely than women to trust generative AI, is still up for discussion.

This summer, OpenAI announced new developments to ChatGPT to improve the advice it offers, revealing that it will no longer offer didactic relationship advice (Getty/iStock)

There is, at least, some recognition of these concerns from the creators of AI. This summer, OpenAI announced new developments to ChatGPT, revealing that it will no longer offer didactic relationship advice, instead offering responses that encourage users to think more deeply about their personal challenges.

“When you ask something like: ‘Should I break up with my boyfriend?’ ChatGPT shouldn’t give you an answer. It should help you think it through – asking questions, weighing pros and cons,” said OpenAI, adding that it’s also developing software to help detect signs of emotional distress so it can direct users to resources for support.

Of course, it would be better if men, like the rest of us, were all seeking dating advice from friends and family members. But this tech is here – and it’s becoming more powerful by the day. Perhaps it’s time we also educated ourselves about how to use it properly before it consumes us entirely. Besides, there is always the chance that, used correctly, the answers might be more helpful than you think. After asking ChatGPT “how to find love”, it responded with a set of clear guidelines, prioritising “self-understanding” and “building a fulfilling life” without a romantic partner first. To be fair, that sounds like pretty solid advice to me.
