The Independent UK
Technology
Andrew Griffin

Experts raise alarm about people using ChatGPT and other AI systems to help with loneliness


Experts have sounded the alarm over the growing use of AI systems such as ChatGPT to cope with loneliness.

The systems are increasingly being relied on by many people as a kind of confidant or friend. But a new article in the British Medical Journal warns that relying on such chatbots could be a cause for concern, especially for young people.

Its authors also call for new strategies to address the loneliness and isolation that lead people to speak to chatbots in this way in the first place. Doctors have long warned that loneliness is itself a public health concern – two years ago the US Surgeon General described it as an epidemic posing a risk comparable to smoking.

In that context, “we might be witnessing a generation learning to form emotional bonds with entities that lack capacities for human-like empathy, care, and relational attunement”, Susan Shelmerdine and Matthew Nour write in the BMJ article.

Studies have even suggested that people are more satisfied having serious conversations with AI tools than having them with other humans, they note.

When evaluating someone’s mental state, clinicians should begin to consider potentially problematic or dangerous chatbot use as an environmental risk factor, the authors suggest.

That might mean doctors gently enquiring about how patients use chatbots, especially those particularly at risk of loneliness, and then asking more specific questions about how much they rely on, and even depend on, speaking to such systems.

The article acknowledges that such AI systems might benefit many patients, including those experiencing loneliness. But it notes that there is currently little way of evaluating whether people’s use of such systems is healthy, and that the creators of such tools may be judging their success on “superficial and myopic engagement metrics” rather than prioritising “long term wellbeing”.

The article, ‘AI chatbots and the loneliness crisis’, is published today in the BMJ.
