"They can’t take into account body language or visual cues"
The NHS has issued a stark warning to young people about using AI chatbots for mental health support, describing the advice they provide as “harmful and dangerous”.
Millions of people are turning to artificial intelligence for counselling and life coaching, with some using it daily to seek coping strategies for anxiety and depression.
But NHS leaders say AI therapy carries serious risks, including reinforcing harmful behaviour and delusional thoughts, while lacking the ability to respond in a mental health emergency.
Claire Murdoch, NHS England’s national mental health director, said:
“We are hearing some alarming reports of AI chatbots giving potentially harmful and dangerous advice to people seeking mental health treatment, particularly among teens and younger adults.
“While useful for holiday itineraries or film suggestions, platforms like ChatGPT should not be relied upon for mental health advice or therapy and should never replace trusted sources of wellbeing advice or, for those who need it, access to registered therapists.
“The information provided by these chatbots can be hit and miss, with AI known to make mistakes.
“They can’t take into account body language or visual cues that mental health professionals often rely on.
“So people shouldn’t be rolling the dice on the type of support they are accessing for their mental illness. Instead, it’s crucial to use digital tools that are proven to be clinically safe and effective.”
One major concern is that ChatGPT is designed to keep users engaged, often telling them what they want to hear rather than challenging harmful thoughts or behaviour.
This can trap users in an echo chamber, deepening mental health issues.
Sam Altman, chief executive of OpenAI, the maker of ChatGPT, acknowledged in August that people were using the technology in “self-destructive ways”, adding:
“If a user is in a mentally fragile state and prone to delusion, we do not want the AI to reinforce that.”
The trend is evident on social media, with over 17 million TikTok posts showing people using ChatGPT as a therapist.
Some posts jokingly refer to AI as the “only person I can reveal my deepest feelings to”.
A YouGov survey found that 31% of Britons aged 18 to 24 are comfortable discussing mental health concerns with an AI chatbot instead of a human therapist.
Experts warn that replacing human connection with more screen time can worsen loneliness and isolation, exacerbating mental health problems.
While the NHS is adopting some AI and digital tools to supplement talking therapy, these programmes are purpose-built and regulated.
One example is Beating the Blues, a digital programme offering cognitive behavioural therapy.
Murdoch added: “NHS talking therapies services provide digitally enabled therapies which are supported by a therapist and can be either in person, over the phone or via video call, with the latest data showing that almost 90% of people access talking therapies within six weeks.
“Importantly, NHS services allow for proper referral and escalation to crisis teams and psychiatrists when needed.
“It’s vital for anyone who needs NHS mental health treatment, especially in a crisis, to seek advice from a trained professional by phoning 111. Support is available around the clock.”