ChatGPT as Therapist? What Research Says, What Americans Are Doing, and a Few Practical Interventions for Marriage and Family Therapists

Friday, August 22, 2025. This is for my dear client Alex in Miami.

It’s 2:17 a.m. in Boston. A college sophomore, already waitlisted for campus counseling, opens her laptop. She doesn’t write in her journal. She opens ChatGPT.

“Why do I hate myself so much?” she types.

The machine—tireless, polite, available—answers.

This is not science fiction. It’s American culture in 2025.

Therapy is expensive, therapists are scarce, loneliness is epidemic, but the machines are always awake.

The question isn’t whether people are using ChatGPT as a therapist.

They are. The question is how, how often, how well—and what happens when they do.

Is ChatGPT Being Used as Therapy in America?

The short answer: yes.

  • Adoption is mainstream. As of mid-2025, about 34% of U.S. adults had tried ChatGPT, double the share from 2023 (Pew Research Center, 2025). Most use it for information, but a significant minority increasingly lean on it for emotional support.

  • Health use is growing. By 2024, 17% of Americans were using AI chatbots monthly for health advice (KFF, 2024). Young adults and teens are leading the way.

  • Teens are experimenting more. Surveys show large numbers of adolescents treating “AI companions” like Replika or Character.AI as confidants (Common Sense Media, 2024).

Americans are already bending ChatGPT into the role of digital therapist—even though it was never built for that.

What the Research Says About AI and Mental Health

  • ChatGPT sounds empathic. A JAMA Internal Medicine study found its answers to patient questions were preferred over physicians’ replies 79% of the time and rated significantly higher in both quality and empathy (Ayers et al., 2023).

  • Users coach and teach it. People actively train ChatGPT to role-play, mirror feelings, and “sound like Carl Rogers with Wi-Fi” (Luo et al., 2025).

  • Purpose-built bots have evidence. Woebot and Wysa have shown reductions in depression and anxiety in randomized controlled trials (Fitzpatrick et al., 2017; Iglesias et al., 2022; Chang et al., 2024).

  • General-purpose LLMs are untested. ChatGPT has never undergone rigorous outcome studies in mental health.

So far, ChatGPT delivers words that feel empathetic. But feeling cared for is not the same as receiving treatment.

Can ChatGPT Replace Human Therapy?

Not really.

  • Teletherapy works. Decades of research show video-based CBT and related modalities are as effective as in-person therapy (Norwood et al., 2018).

  • Mental health chatbots help briefly. They can relieve mild distress, but gains often fade after a few weeks (Zhong et al., 2024).

  • ChatGPT is like caffeine. Helpful in the moment, maybe even soothing, but it doesn’t replace sustained, accountable care.

As one Reddit user put it: “My AI listens better than my boyfriend, but it can’t hug me.”

Why Do People Trust ChatGPT With Their Feelings?

Three reasons: access, affordability, and anonymity.

  • Access. In rural areas, the nearest therapist may be 80 miles away. A bot is seconds away.

  • Affordability. With therapy at $100–$250 a session, ChatGPT is free or nearly free.

  • Anonymity. It won’t gossip, judge, or shame.

The risk: underserved communities could wind up with lesser care—algorithmic consolation instead of real treatment.

As one teen admitted online: “At least my AI doesn’t roll its eyes when I cry.”

What Are the Risks of AI Therapy?

  • Crisis mishandling. ChatGPT sometimes recognizes suicide risk—but sometimes misses it completely (Levkovich et al., 2023; Heston, 2023).

  • Real-world failures. NEDA’s “Tessa” chatbot was shut down after giving harmful weight-loss advice (The Guardian, 2023; Wired, 2023). Koko, a peer-support platform, secretly tested GPT-3 without user consent (Vincent, 2023).

  • Dependence. Studies of Replika show short-term loneliness relief, but heavy reliance correlates with lower well-being (Maples et al., 2024; De Freitas et al., 2024).

Relief today. Dependency tomorrow.

What Does This Mean for American Culture?

In the 1950s, lonely housewives wrote to Dear Abby. In the 1990s, Prozac promised happiness in pill form. In 2025, it’s a chatbot.

ChatGPT produces empathy-shaped words, but it has no nervous system, no grief, no lived history. If wealthier clients get human therapists and poorer ones get bots, we’ve automated inequality. But if bots truly broaden access, we might have found a fragile kind of progress.

How Should Therapists Respond?

Clients are already bringing AI into the room. Therapists have a choice: ignore it, resist it, or integrate it responsibly.

  • Acknowledge It. Ask whether clients are experimenting with ChatGPT or AI companions.

  • Integrate Wisely. Encourage safe uses: journaling prompts, reframing practice, communication skills.

  • Educate Openly. Name the risks: privacy, hallucinations, lack of accountability.

  • Hold the Human Line. Empathy simulated is not empathy lived.

A Few Practical Interventions for Therapists Using ChatGPT

Below are a few ChatGPT prompt interventions that MFTs can use with couples or families. These are not therapy in themselves, but creative adjuncts—structured ways to help clients build skills, reflect, and practice between sessions.

Communication Skills

  • Reframe accusations into “I-statements.”

  • Generate scripts for conflict repair attempts.

  • Model validating vs. invalidating responses.

  • Rewrite recent fights in neutral language.

  • Create a repair attempt “menu” for couples.

Emotional Regulation

  • Use ChatGPT for mindfulness and grounding scripts.

  • Generate reframing statements for anxious thoughts.

  • Script “time-out” protocols for escalation.

  • Create self-soothing mantras.

  • Provide psychoeducation on flooding.

Conflict Resolution

  • Draft a “fair fighting” agreement.

  • Role-play conflict cycles.

  • Generate collaborative problem-solving scripts.

  • Simulate escalation vs. de-escalation scenarios.

  • Offer “we statements” for shared goals.

Intimacy & Connection

  • Brainstorm personalized date ideas.

  • Generate playful texts to increase connection.

  • Create gratitude letter templates.

  • Suggest affection rituals.

  • Draft partner interview questions for rediscovery.

Family Systems & Parenting

  • Draft family rules collaboratively.

  • Create bedtime stories embedding family values.

  • Generate agendas for family meetings.

  • Role-play parent-teen negotiations.

  • Write scripts for explaining divorce to children.

These interventions can be integrated with CBT, EFT, Gottman Method, or family systems work.

They can function like interactive workbooks—not as replacements for therapy, but as structured psychoeducational tools.

So, Should You Use ChatGPT as a Therapist?

A better idea is to think of it as a notebook that talks back.

For journaling, skill rehearsal, and preparation. Not as a crisis line. Not as a replacement for human care.

It can echo your words. It can’t carry your pain.

The Final Question

It’s 2:17 a.m. again. The college student stares at her screen.

The bot has replied with kind words. It helps—just a little.

But when your best friend at 3 a.m. is a chatbot, is that progress? Or just a mirror reflecting our loneliness back at us?

Be Well, Stay Kind, and Godspeed.

REFERENCES:

American Psychological Association. (2024). Artificial intelligence in mental health care (Practice resource). https://www.apa.org/practice/artificial-intelligence-mental-health-care

American Psychological Association. (2025). Ethical guidance for AI in the professional practice of health service psychology. https://www.apaservices.org/practice/news/artificial-intelligence-psychologists-work

Ayers, J. W., Poliak, A., Dredze, M., Longhurst, C. A., Leas, E. C., & Zhu, Z. (2023). Comparing physician and artificial intelligence chatbot responses to patient questions posted to a public social media forum. JAMA Internal Medicine, 183(6), 589–596. https://jamanetwork.com/journals/jamainternalmedicine/fullarticle/2804309

Chang, C.-L., et al. (2024). AI-led mental health support (Wysa) for health care workers: Feasibility study. JMIR Mental Health, 11, e56569. https://pmc.ncbi.nlm.nih.gov/articles/PMC11034576/

De Freitas, J., Uguralp, A. K., Uguralp, Z. O., & Puntoni, S. (2024). AI companions reduce loneliness. arXiv preprint. https://arxiv.org/abs/2407.19096

Fitzpatrick, K. K., Darcy, A., & Vierhile, M. (2017). Delivering cognitive behavior therapy to young adults with symptoms of depression and anxiety using a fully automated conversational agent (Woebot): A randomized controlled trial. JMIR Mental Health, 4(2), e19. https://mental.jmir.org/2017/2/e19/

Heston, T. F. (2023). Safety of large language models in addressing depression. Psychiatry Research, 327, 115343. https://pubmed.ncbi.nlm.nih.gov/38111813/

Iglesias, M., Aguilera, A., & Cusin, C. (2022). Evaluating a digital mental health intervention (Wysa) for patients with orthopedic conditions: Pragmatic trial. JMIR Formative Research, 6(12), e38483. https://pmc.ncbi.nlm.nih.gov/articles/PMC9897276/

Levkovich, I., et al. (2023). Suicide risk assessments through the eyes of ChatGPT: A theoretical assessment framework and exploratory study. JMIR Formative Research, 7, e46936. https://pmc.ncbi.nlm.nih.gov/articles/PMC10551796/

Luo, X., Chen, L., & Naaman, M. (2025). “Shaping ChatGPT into my digital therapist”: A thematic analysis of users’ mental health practices with a general-purpose chatbot. Proceedings of the ACM on Human-Computer Interaction. https://pmc.ncbi.nlm.nih.gov/articles/PMC12254646/

Malgaroli, M., et al. (2025). Large language models for the mental health community: Framework for translating code to care. The Lancet Digital Health, 7(4), 299–307. https://www.thelancet.com/journals/landig/article/PIIS2589-7500%2824%2900255-3/fulltext

Maples, B., et al. (2024). Loneliness and suicide mitigation for students using Replika. Nature Mental Health, 2, 456–468. https://www.nature.com/articles/s44184-023-00047-6

Norwood, C., Moghaddam, N. G., Malins, S., & Sabin-Farrell, R. (2018). Working alliance and outcome effectiveness in videoconferencing psychotherapy: Systematic review and noninferiority meta-analysis. Journal of Medical Internet Research, 20(8), e132. https://www.jmir.org/2018/8/e132

Pew Research Center. (2025). 34% of U.S. adults have used ChatGPT, about double the share in 2023. https://www.pewresearch.org/short-reads/2025/06/25/34-of-us-adults-have-used-chatgpt-about-double-the-share-in-2023/

The Guardian. (2023, May 31). Eating disorder chatbot suspended after harmful advice. https://www.theguardian.com/technology/2023/may/31/eating-disorder-hotline-union-ai-chatbot-harm

Vincent, J. (2023, January 6). AI-powered mental health experiment sparks backlash. The Verge. https://www.theverge.com/2023/1/6/23542387/koko-mental-health-ai-chatbot-experiment-backlash

Wired. (2023, May 31). NEDA suspends eating disorder chatbot after harmful advice. https://www.wired.com/story/tessa-chatbot-suspended/

Zhong, W., Luo, J., & Zhang, H. (2024). The therapeutic effectiveness of AI-based chatbots in alleviation of depressive and anxiety symptoms: A systematic review and meta-analysis. Journal of Affective Disorders, 356, 1–12. https://doi.org/10.1016/j.jad.2024.04.057
