KOTZ 720 AM and KINU 89.9 FM --- Based in Kotzebue, serving Northwest Alaska and beyond!
'How are you using AI?' Your therapist should ask you that question, experts argue

ChatGPT, Claude and Character.AI are chatbots powered by artificial intelligence that people are increasingly using.
Kiichiro Sato / AP

Increasingly, teens and adults are turning to artificial intelligence chatbots for companionship and emotional support, recent studies and surveys show. So mental health care providers should ask if and how their patients are using this technology, just as they ask about sleep, diet, exercise and alcohol consumption.

That's according to a new paper out in JAMA Psychiatry.

"We're not saying that AI use is good or bad," says Shaddy Saba, an assistant professor at New York University's Silver School of Social Work, "just like we wouldn't say substance use is necessarily good or bad, [or] consulting with a friend about something is good or bad."

However, learning about a person's use of AI for emotional support and advice could provide valuable insight into someone's life and mental health status, he says.

"Our job is to understand why people are behaving as they are — in this case, why they are seeking help from an AI system," adds Saba. "And to learn about what it's doing for them, what it's not doing for them."

The recommendations from Saba and his co-author are "very aligned" with those in a health advisory the American Psychological Association (APA) released in November of last year, says the APA's Vaile Wright.

Asking what a patient is getting out of their conversations with an AI chatbot sets "a foundation for the therapist to better know how they are trying to navigate their emotional wellbeing and their mental illness," says Wright.

"Treasure trove of information"

"People are using these tools on a regular basis to ask about how to cope with stressful experiences, personal relationship challenges," explains Saba. 

And some are using chatbots for advice on how to cope with symptoms of anxiety and depression.

"To the extent that we can prompt our clients to bring these conversations, in increasing detail, even into the therapy room, I think there's potentially a treasure trove of information," he says.

It could be information about the main causes of stress in someone's life, or if they are turning to a chatbot as a way to avoid confrontations.

"Let's say, for example, you have a client who is having relationship issues with their spouse," says the APA's Wright. "And instead of trying to have open conversations with their spouse about how to get their needs met, they're instead going to the chatbot to either fill those needs or to avoid having these difficult conversations with their spouse."

That background will help a therapist better support the patient, she explains.

"Helping them understand how to have a safe conversation with their spouse, helping them understand the limitations of AI as a tool for filling those gaps in those needs."

Discussing use of AI is also a chance to learn about things a client might not voluntarily share with a therapist, says psychiatrist Dr. Tom Insel, former director of the National Institute of Mental Health. "People often use the chatbots to talk about things that they can't talk about with other people because they're so worried about being judged," he says.

For example, a patient may be reluctant to share suicidal thoughts with their therapist, yet that information is critical for the therapist to know in order to keep the patient safe.

Be curious but don't judge 

When it comes to first broaching the subject with patients, Saba suggests doing it without any judgment.

"We don't want to make clients feel like we're judging them," he says. "They're just not going to want to work with us in general if we do that."

He recommends therapists approach the topic with genuine curiosity, and offers suggested language for these conversations.

"'You know, AI is something that's kind of rapidly growing, and I'm hearing from a lot of people that they're using things like ChatGPT for emotional support,'" he suggests. "'Is that the case for you? Have you tried that?'"

He also recommends asking specific questions about what they found helpful so they can better understand how a patient is using these tools.

It could also help a therapist figure out if a chatbot can complement therapy in helpful ways, says Insel, such as to vet which topics to bring to their sessions or to vent about day-to-day life.

In a way, therapy and chatbots "could be aligned to work together," says Insel.

Saba and his co-author, William Weeks, also suggest asking patients whether they found any chatbot interactions unhelpful or problematic, and offering to share the risks of using chatbots for emotional support.

One such risk involves data privacy: many AI companies use the conversations, even sensitive ones, to further train their models.

There are also risks of treating a chatbot like a therapist, says Insel.

Talking with a chatbot about one's mental health is "the opposite of therapy," he says, because chatbots are designed to affirm and flatter, reinforcing users' thoughts and feelings.

"Therapy is there to help you change and to challenge you," says Insel, "and to get you to talk about things that are particularly difficult."

Adopting the advice

Psychologist Cami Winkelspecht has a private practice working primarily with children and adolescents in Wilmington, Del.

She has been considering adding questions about social media and AI use to her intake form, and she appreciated Saba's paper because it offered sample questions to include.

ChatGPT's landing page on a computer screen.
Kiichiro Sato / AP

Over the past year or so, Winkelspecht has had a growing number of clients and their parents ask her for help with using AI for brainstorming and other tasks in ways that don't break a school's honor code. So, she's had to familiarize herself with the technology in order to be able to support her clients. Along the way, she's come to realize that therapists and kids' parents need to be more aware of how children and teens are using their digital devices — both social media and AI chatbots.

"We don't necessarily think about what they're doing with their phones quite as much," says Winkelspecht. "And I think it's pretty clear that we need to be doing that more and encouraging ourselves to have that conversation."

Copyright 2026 NPR

Rhitu Chatterjee
Rhitu Chatterjee is a health correspondent with NPR, with a focus on mental health. In addition to writing about the latest developments in psychology and psychiatry, she reports on the prevalence of different mental illnesses and new developments in treatments.