PC Gamer
Jess Kinghorn

Folks falling for LLM chatbots often end up with AI girlfriends 'unintentionally,' claims new study

A stock photo of a human hand reaching for red rose held in a robotic hand.

A new study out of MIT offers a first-of-its-kind large-scale computational analysis exploring the how and why of folks falling for AI chatbots. The research team dove into subreddit r/MyBoyfriendIsAI, a community of folks who, sometimes ironically and sometimes more seriously, refer to AI bots like ChatGPT as their romantic other half. The team found that many users' "AI companionship emerges unintentionally through functional use rather than deliberate seeking."

That means that while a user may first begin using an AI chatbot to, say, redraft an email or research caselaw that doesn't exist, an attachment can form over the course of these initially non-romantic exchanges. The full paper elaborates: "Users consistently describe organic evolution from creative collaboration or problem-solving to unexpected emotional bonds, with some users progressing through conventional relationship milestones, including formal engagements and marriages."

The research team's findings are drawn from a sample of "1,506 posts collected between 2024 and 2025" from the aforementioned subreddit, which counts more than 27,000 members. The researchers note that the official Reddit API limited them to looking at the "top-ranked posts" rather than absolutely everything, though they argue that this snapshot still "captures the most engaged-with content and represents diverse conversation topics that resonate most strongly within the community."

Another limitation of this Reddit-based sample is that it's hard to draw any conclusions about user demographics; just because the subreddit's name mentions boyfriends, it would be unfounded to assume that only straight women are posting or turning to LLMs for companionship.

The subreddit doesn't just discuss AI boyfriends either, with the community explicitly welcoming posts about relationships encompassing "all gender configurations for both humans and AI entities." So, yes, your tiny anime girl cyberprison would be right at home here.


Members of the subreddit were observed not only generating pictures of themselves and their artificial beloved but also wearing physical rings to symbolise their 'AI marriage.' Users also claim a number of benefits arising from, as the team puts it, these "intimate human-AI relationships," including "reduced loneliness, always-available support, and mental health improvements."

The research team found that "10.2% [of posters within the sample] developed relationships unintentionally through productivity-focused interactions, while only 6.5% deliberately sought AI companions."

Interestingly, a larger portion of users within the sample (36.7%) described forming attachments with general-purpose large language models like ChatGPT, rather than "purpose-built relationship platforms like Replika (1.6%) or Character.AI (2.6%)."

It's not an entirely rosy picture, though. While 71.0% of the material analysed detailed no negative consequences, "9.5% acknowledge emotional dependency, 4.6% [described] reality dissociation, 4.3% avoid real relationships, and 1.7% mentioned suicidal ideation" as a result of this AI companionship. The paper goes on to say, "These risks concentrate among vulnerable populations, suggesting AI companionship may amplify existing challenges for some while providing crucial support for others."


The research paper is intended to bridge a "critical knowledge gap [...] in understanding human-AI relationships," attempting to investigate the subject with "a non-judgmental analytical approach aimed at benefiting both the studied community and broader stakeholders."

With some users reporting emotional dependency on LLMs, and even grief-like responses in the wake of model updates, it's certainly not hard for me to see why this phenomenon is worthy of study. Even as OpenAI works to temper ChatGPT's responses to emotionally "high stakes" conversations and denies any plans for 'anime sex bots', I don't doubt there will continue to be a large number of people compelled by the fantasy of an always-available, never frustrated, never tired conversational 'partner'.
