As Americans get lonelier and lonelier, a growing number of people are getting some emotional support from artificial intelligence chatbots — and some mental health experts are concerned.
“The topic of AI for therapy [and] emotional support companionship is coming up a lot,” says Leanna Fortunato, a licensed clinical psychologist and director of quality and health care innovation for the American Psychological Association. “Anecdotally, providers are talking about it, and we know from the research that people are using AI tools for that kind of support more and more.”
Some chatbot users accidentally fall into mental health-related conversations — by complaining about a stressful day to a digital entity that’s guaranteed to listen, for example. Others may seek mental health advice from an AI chatbot that isn’t a licensed professional, but is less expensive than a therapist, Fortunato says.
In a health research survey of more than 20,000 U.S. adults, 10.3% of participants said they used generative AI daily. Of that group, 87.1% reported using the tech for personal reasons, including advice and emotional support. The study, published on Jan. 21, was conducted by researchers from institutions including Massachusetts General Hospital, Weill Cornell Medicine and Northeastern University.
On TikTok, the search term “Therapy AI Bot” has at least 11.5 million posts, ranging from users sharing their best prompts for turning chatbots into therapists to health experts warning about the potential dangers involved.
Technology companies are spending billions of dollars developing AI tools and working to integrate them further into people’s daily lives. But AI chatbots don’t always recognize when a user is experiencing a serious health crisis, and may not respond appropriately. In a Nov. 23 report, The New York Times found “nearly 50 cases of people having mental health crises during conversations with ChatGPT,” including three deaths.
Companies like Anthropic, Google and ChatGPT-maker OpenAI say they’re working with mental health experts to strengthen their tools’ responses to sensitive conversations. “These are incredibly heartbreaking situations and our thoughts are with all those impacted,” an OpenAI spokesperson tells CNBC Make It. “We continue to improve ChatGPT’s training to recognize and respond to signs of distress, de-escalate conversations in sensitive moments, and guide people toward real-world support, working closely with mental health clinicians and experts.”
Frequent conversations with AI companions can erode people’s real-life social skills, according to an April 2025 paper written by an OpenAI product policy researcher. And an OpenAI-MIT Media Lab study, also published in April 2025, found that heavy daily use of ChatGPT is correlated with increased loneliness.
The American Psychological Association strongly advises against using AI as a substitute for therapy and mental health support.
Some mental health professionals say there are still relatively low-risk ways to engage with chatbots on certain mental health-adjacent topics. Here’s what you need to know.
‘I see it as a tool, and I think that a tool can be helpful’
AI chatbots can be useful for learning about mental health, says psychotherapist and lifestyle coach Esin Pinarli. They can help you generate journaling prompts for reflection, and you can ask them for links to research papers about coping strategies, treatment options and other questions you may have about mental health conditions, she says.
“I don’t see it as [a substitute for] therapy. I see it as a tool, and I think that a tool can be helpful,” says Pinarli, the founder of Boca Raton, Florida-based private practice Eternal Wellness Counseling. Her clients sometimes talk to ChatGPT about specific situations in their personal lives, and then run its responses past her before acting on them, she says.
In her personal AI testing, Pinarli has seen chatbots sometimes use language that supports a user’s “unhealthy behaviors,” she says. If you ask a chatbot about a confrontation you had with a friend, it might tell you that your friend is being too sensitive, for example — even if you’re actually the one in the wrong.
If an exchange with an AI chatbot touches on your mental health, Fortunato recommends asking yourself:
- Is there a reputable source that I can cross-check this information with?
- Do I have a provider that I can ask these questions to?
Reputable sources could include peer-reviewed scientific studies, articles from health news organizations or resources from medical organizations like Harvard Health Publishing or the Mayo Clinic. “AI could really increase people’s access to health information,” Fortunato says. “[But] AI isn’t necessarily going to always give you correct information.”
Keep these considerations in mind when using AI
Pinarli and Fortunato agree that people shouldn’t use AI chatbots to get a diagnosis or to seek support during a mental health crisis, especially suicidal ideation. During an active mental health crisis, you can always call or text the Suicide and Crisis Lifeline at 988, which is free, confidential and available 24 hours a day, seven days a week.
“We’ve seen some really high-profile harms, particularly for youth or vulnerable groups who might be in crisis, where AI didn’t handle the situation correctly,” Fortunato says. “It continued to engage with people who were in crisis. It didn’t provide crisis resources. It didn’t challenge a pattern of thinking that was problematic.”
They also both say that you shouldn’t share your medical records or any personal identifying information with a chatbot, because those conversations aren’t confidential or legally protected. And you generally shouldn’t rely on AI to solve problems in your real-life human relationships, says Pinarli.
“You need another person with another nervous system across from you in order to pay attention to body language, to tone of voice,” she says. Chatbots are “not going to challenge you emotionally, and they don’t require reciprocity.”
If you’re experiencing a mental health crisis or concerning mental health symptoms, you can contact the free, confidential SAMHSA National Helpline at 1-800-662-HELP (4357).