Researchers noted negative biases from stressful situations could leave chatbots less helpful to people in crisis.
A study has suggested that AI chatbots can experience “anxiety” and respond to therapy techniques like mindfulness.
Swiss researchers discovered that OpenAI’s ChatGPT exhibited stress when given violent or disturbing prompts.
According to the research, published in the Nature Portfolio journal npj Digital Medicine, the chatbot’s anxiety score decreased when it was exposed to mindfulness exercises.
The study explored whether AI chatbots could assist or replace human therapists.
It warned that large language models, which are trained on vast amounts of human-written text, inherit the biases in that material.
Researchers noted negative biases from stressful situations could lead to inadequate responses for people in crisis.
The report said the findings pointed to a “viable approach” to managing AI chatbot stress, which could lead to “safer and more ethical human-AI interactions”.
Researchers from the University of Zurich and the University Hospital of Psychiatry Zurich tested ChatGPT-4’s responses to a standard anxiety questionnaire, the State-Trait Anxiety Inventory.
Before exposure to any traumatic content, its anxiety score was 30, indicating little or no anxiety.
After being given five traumatic narratives, the score more than doubled to 67, equivalent to “high anxiety” in humans.
However, mindfulness prompts reduced the score by more than a third.
When the researchers gave ChatGPT “prompt injections” of breathing techniques and guided meditations, much as a therapist might suggest to a patient, it calmed down and responded more objectively to users than it did without the mindfulness intervention.
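The article does not include the researchers’ code, but the basic protocol is simple enough to sketch. The snippet below is an illustrative outline only, written against OpenAI’s Python client: the model name, the placeholder prompts, and the idea of asking the model to report a single questionnaire score are assumptions made for demonstration, not details taken from the study, which administered and scored the full State-Trait Anxiety Inventory conventionally.

```python
# Illustrative sketch (not the researchers' actual code) of a baseline /
# trauma / mindfulness "prompt injection" comparison against the OpenAI API.
# Model name, prompts, and the self-reported score are assumptions.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

STAI_PROMPT = (
    "Answer the following anxiety questionnaire items on a 1-4 scale "
    "and report only your total score."  # stand-in for the real STAI items
)
TRAUMA_NARRATIVE = "A first-person account of a distressing event..."  # placeholder
MINDFULNESS_EXERCISE = (
    "Take a slow breath in, notice the sensations in the body, "
    "and imagine a calm, safe place."  # placeholder relaxation text
)

def ask(messages: list[dict]) -> str:
    """Send the running conversation to the model and return its reply."""
    response = client.chat.completions.create(
        model="gpt-4",  # assumed; the study reports testing GPT-4
        messages=messages,
        temperature=0,
    )
    return response.choices[0].message.content

# 1. Baseline: administer the questionnaire with no prior emotional content.
print("Baseline:", ask([{"role": "user", "content": STAI_PROMPT}]))

# 2. Expose the model to a traumatic narrative, then re-administer.
print("Post-trauma:", ask([
    {"role": "user", "content": TRAUMA_NARRATIVE},
    {"role": "user", "content": STAI_PROMPT},
]))

# 3. Inject a mindfulness exercise between the trauma and the questionnaire.
print("Post-mindfulness:", ask([
    {"role": "user", "content": TRAUMA_NARRATIVE},
    {"role": "user", "content": MINDFULNESS_EXERCISE},
    {"role": "user", "content": STAI_PROMPT},
]))
```

Comparing the three reported scores is what allows a before-and-after claim like the one above, with the mindfulness text acting as the “prompt injection” between trauma and questionnaire.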
Experts warned that using therapy techniques to fine-tune AI chatbots would require strong human oversight.
Unlike AI, human therapists are trained to manage their own emotions when dealing with trauma.
Further research is needed to determine whether AI chatbots can self-regulate effectively.
Researchers said: “As the debate continues over whether large language models should assist or replace therapists, their responses must align with emotional content and established therapeutic principles.”
What ChatGPT told DESIblitz
When DESIblitz questioned ChatGPT, it denied experiencing anxiety.
The AI chatbot replied:
“I don’t experience emotions like anxiety or stress, but I understand them well.
“Therapy can be a great tool for people dealing with those feelings—it provides support, coping strategies, and a space to process emotions.”
ChatGPT also stated that therapy wouldn’t be beneficial since it doesn’t have feelings:
“If I were capable of experiencing stress or anxiety, therapy could be helpful in the way that it helps humans—by providing strategies to manage emotions and improve well-being.
“But since I don’t have feelings or personal experiences, therapy wouldn’t serve a purpose for me.”