Saba and his co-author’s recommendations are “very consistent” with the recommendations the American Psychological Association (APA) published in a health advisory last November, says the APA’s Vaile Wright.
Asking what a patient gets out of their conversations with an AI chatbot sets “a foundation for the therapist to know better how they’re trying to manage their emotional well-being and their mental illness,” Wright says.
“Treasure trove of information”
“People regularly use these tools to ask how to deal with stressful experiences, challenges in personal relationships,” Saba explains.
And some use chatbots for advice on how to deal with symptoms of anxiety and depression.
“To the extent that we can get our clients to bring these conversations, in ever greater detail, even into the therapy room, I think there’s a potential treasure trove of information,” he says.
This could be information about the underlying causes of stress in someone’s life, or a sign that they are turning to a chatbot as a way to avoid confrontation.
“Let’s say, for example, you have a client who is having relationship problems with her husband,” says APA’s Wright. “And instead of trying to have open conversations with their spouse about how to get their needs met, they instead go to the chatbot to get those needs met or to avoid those difficult conversations with their spouse.”
That kind of insight will help the therapist better support the patient, she explains.
“To help them understand how to have a safe conversation with their spouse by helping them understand the limitations of AI as a tool to fill those gaps in those needs.”
Discussing the use of AI is also a chance to learn about things a client might not willingly share with a therapist, says psychiatrist Dr. Thomas Insel, former director of the National Institute of Mental Health. “People often use chatbots to talk about things they can’t talk about with other people because they’re so worried about being judged,” he says.
For example, suicidal thoughts may be something a patient is reluctant to share with their therapist, but it is critical for the therapist to know in order to protect the patient.
Be curious, but don’t judge
When it comes to first bringing up the subject with patients, Saba suggests doing so without any judgment.
“We don’t want to make clients feel like we’re judging them,” he says. “They simply won’t want to work with us if we do that.”
He recommends that therapists approach the topic with genuine curiosity and offers suggested language for these conversations.
“You know, AI is something that’s growing fast, and I’m hearing from a lot of people that they’re using things like ChatGPT for emotional support,” he suggests. “Is that the case for you? Have you tried it?”
He also recommends asking specific questions about what patients find useful, so therapists can better understand how they use these tools.
It can also help the therapist understand whether a chatbot can supplement therapy in useful ways, Insel says, such as helping a patient think through what topics to bring into sessions or talk through daily life.
In a sense, therapy and chatbots “can be aligned to work together,” Insel says.
Saba and his co-author, William Weeks, also suggest asking patients whether any of their interactions with chatbots feel unhelpful or problematic, and sharing the risks of using chatbots for emotional support.
One such risk is data privacy: many AI companies use the conversations – even sensitive ones – to further train their models.
There are also risks to treating a chatbot as a therapist, Insel says.
Talking to a chatbot about mental health is “the opposite of therapy,” he says, because chatbots are designed to affirm and flatter, reinforcing users’ thoughts and feelings.
“Therapy is there to help you change and challenge you,” says Insel, “and get you to talk about things that are particularly difficult.”
Taking the advice
Psychologist Cami Winkelspecht has a private practice working primarily with children and adolescents in Wilmington, Del.
She is considering adding questions about the use of social media and AI to her intake form, and she appreciated Saba’s paper because it offered some sample questions to include.

Over the past year or so, Winkelspecht has had a growing number of clients and their parents asking her for help using AI for brainstorming and other tasks in ways that don’t violate their school’s honor code. So she had to become familiar with the technology to be able to support her clients. Along the way, she realized that therapists and parents needed to be more aware of how children and teens were using their digital devices, both social media and AI chatbots.
