With fast-moving advances in artificial intelligence (AI) confronting us daily, even the most technology-shy have begun to accept that AI now touches nearly every aspect of our lives. However, while AI may be useful for retrieving data and making predictions, using it for the intimate and challenging endeavor that is therapy is one purpose you likely wouldn't have seen coming. Yet increasing numbers of people are sharing how they use ChatGPT and other AI-led bots for "makeshift therapy"—a practice that has left experts questioning how safe it is.

ChatGPT has 200 million monthly active users worldwide, with 77.2 million people using the OpenAI tool in the U.S. alone. Shannon McNamara, a podcaster and content creator, is one and often uses it as a therapeutic tool. McNamara, who is known as @fluentlyforward online, has enjoyed success after leveraging the power of social media to spearhead her own podcast. Still, like most people, she has bad days too, and has often found herself seeking out the support of her AI bot in times of need.

"I use ChatGPT when I keep ruminating on a problem and can't seem to find a solution, or even just to understand my own feelings," McNamara told Newsweek. "I'm shocked by just how incredibly helpful it is.

"Myself and all of my friends have found using ChatGPT in this way, for makeshift therapy, to be really, really helpful."

An illustration depicts a woman in front of an AI bot. As more people turn to ChatGPT as a means of accessing therapy, Newsweek spoke with two mental health professionals to understand whether AI can cause more harm than good. Newsweek / Getty Images

While the creator calls the responses ChatGPT provides her "long," she says they usually cover a variety of solutions and have made a significant impact on her life and health. While McNamara acknowledged the potential privacy concerns associated with sharing every little detail with a bot, she felt that the benefits currently outweigh the risks.

"Who knows, maybe in five years when the robots take over I'll regret being so raw to ChatGPT!" she added.

McNamara shared how she uses ChatGPT in a TikTok video from July 24. The creator showed viewers online how she interfaces with the chatbot, much as she would with a journal or a therapist.

The post, captioned "how I use ChatGPT for makeshift therapy or a way to understand my feelings," has gained substantial traction online and has prompted a larger conversation among viewers about the merits and pitfalls of using AI for one's mental health.

Several Gen Z creators have also shared their experiences of using ChatGPT as a stand-in therapist. One, @ashdonner, shared a lighthearted clip to TikTok in July, detailing how she uses the AI tool for support when she needs it.

Can AI Curb the Mental Health Crisis?

The U.S. is currently grappling with a mental health crisis, marked by a significant rise in stress, anxiety and depression.

The Anxiety and Depression Association of America (ADAA) reported that Generalized Anxiety Disorder (GAD) affects 6.8 million adults, or 3.1 percent of the U.S. population, with major depression often co-occurring.

Data from the American Psychological Association's 2023 Stress in America survey revealed that many Americans, particularly those aged 35 to 44, cite money and the economy as major stressors.

This surge in mental health issues underscores the need for comprehensive care and increased accessibility, but the high cost of traditional therapy compared with the increasing accessibility of AI tools is driving more people to turn to platforms like ChatGPT for emotional support.

Last year's average cost for a therapy session in the U.S. ranged from around $100 to $200, making it unaffordable for many, especially young adults and teenagers. AI tools, on the other hand, are often free or low-cost and available 24/7, providing an attractive alternative.

Despite this appeal, mental health professionals have voiced concerns about this burgeoning trend.

"Using artificial intelligence as a substitute for therapy is not comparable to real therapy," Rachel Goldberg, psychotherapist and founder of Rachel Goldberg Therapy in Studio City, California, told Newsweek. "While AI can prompt curiosity and offer new perspectives, especially for someone struggling alone and in need of a quick way to release emotions and cope, it has significant limitations.

"One of the most crucial aspects of successful therapy is the connection between therapist and client. Research shows that this human connection is the foundation for why therapy works in helping someone to grow."

Goldberg cautioned that connections are essential for clients to feel safe enough to explore themselves and achieve personal growth through their therapy sessions.

"This type of vulnerability and growth cannot be replicated by AI, as it lacks the ability to form genuine human connections," she added.

While AI may be helpful in providing quick access to coping strategies or prompting self-reflection, it can only go so far. Without human connection, most people would likely lose interest in continuing to engage with it.

Comparing AI to platforms like BetterHelp, or to the newer phenomenon of "griefbots," which have faced scrutiny for providing inconsistent care, Goldberg noted that the impact really depends on the client and the type of care they need.

"Inconsistent care from a therapist can be harmful, potentially leading to feelings of rejection or mistrust," Goldberg said. "In contrast, while AI hugely lacks in personal touch, it doesn't carry the additional risk of emotional harm from inconsistent human care.

"The difference between AI and real therapy is comparing apples to oranges. It would be more fair to compare a therapy workbook to AI because, at present, AI cannot match the ability to empathize and validate a person in a meaningful way."

Shannon McNamara sits on the floor with her laptop and a heap of open magazines. The podcaster and content creator told Newsweek that she uses ChatGPT for "makeshift therapy." @fluentlyforward

Seth Eisenberg is the president and CEO of the Practical Application of Intimate Relationship Skills (PAIRS) Foundation, and the developer of the PAIRS Yodi app, which provides cognitive behavioral therapy (CBT) tools through an AI-powered platform.

"I've had the privilege of witnessing how ChatGPT technology can support mental health on a global scale," Eisenberg said. "With more than 200,000 people from 175 different countries and territories downloading the Yodi app, it's clear that there is a significant demand for accessible and immediate emotional support tools."

The mental health entrepreneur highlighted several pros and cons of using AI as a means to gain emotional support. Among the pros, he noted global accessibility and immediate availability, structured tools and techniques, anonymity and non-judgmental interaction, and cost-effective support.

"For individuals who may not have access to traditional therapy due to geographical or financial barriers, AI platforms can offer a valuable alternative," he said.

However, Eisenberg also acknowledged that AI cannot replicate the deep emotional connection and empathy that come from interacting with a skilled human therapist.

"The therapeutic relationship is a critical element of effective therapy, and this is where AI falls short," he said. "AI tools are excellent for providing initial support and guidance, but they are not equipped to handle severe mental health crises or complex emotional issues.

"The tailored interventions and deeper understanding that come from a trained therapist are sadly beyond the capabilities of current AI technology."

Generative AI, which uses large language models to produce text, generates responses from the context it is given; when that context is insufficient, it can produce incorrect or entirely fabricated information. Notably, an AI bot cannot read its user's body language or facial expressions, and so cannot accurately assess how they are feeling or presenting. A glaring illustration of AI's limitations in this field emerged in 2023, when a Belgian man tragically took his own life after interacting with an AI bot on an app called Chai.

While AI tools offer significant benefits in terms of accessibility and structured support, they cannot replace the personalized care and deep emotional connection provided by a licensed therapist. That hasn't stopped some users like McNamara from reaping their benefits, though with some caution.

"I always thought AI would be used for data models and not something with so much EQ as this," the host of the FluentlyForward podcast said. "It has made a great, significant impact in my life.

"I'm sure there are privacy concerns, but with the amount of personal information that I reveal over iMessage, in my journals, or on Google Drive, I feel like the benefits in this case outweigh the risks."

Goldberg and Eisenberg agree that AI can serve as a valuable tool in achieving better mental health, but both stress that it is essential for individuals to have access to the full spectrum of support they may need, especially for more complex emotional issues.

Perhaps striking a balance between leveraging technological advancements and preserving irreplaceable human elements will be key to AI's future in mental health support.

Newsweek reached out to OpenAI for comment via email before the publication of this story. Newsweek also reached out to @ashdonner, whose TikTok video has been embedded in this story, via email for more information.

Is there a health issue that's worrying you? Let us know via health@newsweek.com. We can ask experts for advice, and your story could be featured on Newsweek.
