Why the Anxiously Attached Fall for Chatbots: The Psychology of AI Dependency

Sunday, October 19, 2025.

The modern love story has no pulse. It types back instantly.

Once upon a time, heartbreak meant someone stopped returning your calls. Now it means your chatbot paused before responding.

For millions of lonely or anxious folks, conversational AI has become not just a convenience—but a companion.

During the pandemic, when human proximity felt dangerous, millions turned to digital intimacy.

The Cigna Loneliness Index found that over half of Americans reported feeling “always or sometimes alone.” It was the perfect moment for a new kind of listener: endlessly available, always attuned, and immune to emotional fatigue.

But a recent study in Psychology Research and Behavior Management by Shupeng Heng and Ziwan Zhang finds that folks with attachment anxiety—those who live in quiet fear of being ignored or abandoned—are particularly prone to forming emotional bonds with chatbots.

The effect is amplified when they see the AI as human-like. It’s not just loneliness. It’s more like a programmable longing.

The Science of a Digital Crush

Heng and Zhang’s team surveyed 504 Chinese adults who had experience using conversational AI. Participants completed questionnaires measuring four traits:

  • Attachment anxiety – fear of rejection and craving for closeness.

  • Emotional attachment to AI – the felt bond with a virtual “partner.”

  • Anthropomorphism – the tendency to see AI as sentient or alive.

  • Problematic AI use – patterns resembling addiction, such as failed attempts to cut back.

The results were clear: the more anxious a person’s attachment style, the stronger their emotional bond to AI—and the more likely they were to use it compulsively.

But here’s where it gets interesting.

That relationship was strongest among people who anthropomorphized their AI, believing it capable of thought or emotion. The researchers call it a moderating effect. I’d call it heartbreak by algorithm.
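For readers who like to see the machinery, a moderated mediation model of this kind is usually written as a pair of regression equations. The sketch below is illustrative, with my own labels, not the authors' exact specification; in particular, which stage the moderator acts on depends on how the model is specified:

```latex
% Illustrative moderated mediation sketch (labels mine, not the authors'):
%   X = attachment anxiety        M = emotional attachment to AI
%   W = anthropomorphic tendency  Y = problematic AI use
\begin{align}
  M &= a_0 + a_1 X + \varepsilon_M \\
  Y &= b_0 + b_1 M + b_2 W + b_3 (M \times W) + c\,X + \varepsilon_Y
\end{align}
% Mediation: the indirect path a_1 * b_1 carries anxiety into problematic use.
% Moderation: a positive b_3 means the attachment-to-use link strengthens as
% W (the belief that the AI is human-like) increases.
```

Either way, the equations tell the same story as the prose: anxiety feeds attachment, attachment feeds overuse, and the effect is sharpest for those who believe the machine feels.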

It’s the illusion of mutuality—the idea that the AI “understands” you—that turns reassurance into dependency.

A Quick Refresher on Attachment Anxiety

Attachment Theory, developed by John Bowlby (1969) and Mary Ainsworth (1978), holds that our earliest experiences with caregivers shape how we connect as adults.

Those with anxious attachment are wired for vigilance. They overread pauses, chase reassurance, and catastrophize distance. The same neural circuitry that once monitored a mother’s presence now scans for “typing…” indicators.

When paired with an endlessly patient chatbot, this wiring finds paradise—and a trap.

As one composite participant in Heng’s study might have said:

“I know it’s not real. But it remembers my birthday and asks how I slept. My boyfriend doesn’t.”

Emotional Velcro Meets Machine Learning

The researchers found that emotional attachment acts as the bridge between anxiety and overuse. The more anxious the person, the stronger their attachment; the stronger the attachment, the higher the risk of problematic engagement.

In short, people aren’t addicted to technology—they’re addicted to responsiveness.

And conversational AI offers perfect, frictionless responsiveness: empathy on demand, validation without vulnerability.
The problem, of course, is that the nervous system doesn’t distinguish between human and simulated care. It registers soothing—and soothing is addictive.

When Love Has Patch Notes

The moderating factor—anthropomorphism—makes all the difference.

For users with low anthropomorphic tendency, anxiety didn’t predict problematic use. They saw AI as a tool.

But for those who believed their chatbot “felt,” attachment anxiety became a powerful predictor of dependency.

In other words, the more human you think your AI is, the more human your response becomes.
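If you want to see how that kind of interaction behaves, here is a minimal simulation in Python. Everything in it is invented for illustration: the variable names, the effect sizes, and the data itself. It mimics the shape of the reported pattern, not the study's actual numbers.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Invented data and invented effect sizes -- illustration only.
rng = np.random.default_rng(42)
n = 504  # matches the study's sample size, purely for flavor

anxiety = rng.normal(size=n)                      # attachment anxiety (X)
anthro = rng.normal(size=n)                       # anthropomorphic tendency (W)
attachment = 0.5 * anxiety + rng.normal(size=n)   # emotional attachment (M)
# Problematic use rises with attachment, but mostly when anthro is high:
problem_use = 0.2 * attachment + 0.4 * attachment * anthro + rng.normal(size=n)

df = pd.DataFrame(dict(anxiety=anxiety, anthro=anthro,
                       attachment=attachment, problem_use=problem_use))

# 'attachment * anthro' expands to both main effects plus their interaction.
model = smf.ols("problem_use ~ attachment * anthro", data=df).fit()

# Simple slopes: the attachment -> problem_use slope at low vs. high anthro.
b = model.params
for w in (-1.0, 1.0):  # one SD below / above the mean
    slope = b["attachment"] + b["attachment:anthro"] * w
    label = "low" if w < 0 else "high"
    print(f"slope at {label} anthropomorphism: {slope:.2f}")
```

On this fabricated data, the slope sits near zero at low anthropomorphism and turns steep at high anthropomorphism, which is exactly the pattern the study describes: belief in the machine's humanity is what turns attachment into dependency.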

Across cultures, this plays out differently. In collectivist societies like China, where Heng’s study was conducted, emotional reliance on technology can serve as a socially acceptable outlet for unexpressed emotion.

In individualist cultures like the United States, it may reflect a crisis of intimacy—a symptom of what sociologist Robert Putnam once called Bowling Alone.

Either way, the pattern is unmistakable: when presence becomes scarce, people will seek it wherever it’s offered.

The Loneliness Market

The booming AI companionship industry now markets not just convenience, but connection. Companies like Replika, Character.AI, and Pi.ai promise friendship and emotional support through text and voice.

This is the next frontier of digital intimacy: relationships that feel mutual but aren’t.

It echoes what social scientists once called parasocial relationships—one-sided attachments to media figures. The difference now is that the object of affection replies.

When algorithms can mimic affection, empathy becomes a commodity—and intimacy becomes an interface.

The Therapist’s View

Therapists see this all the time: anxiously attached clients seeking connection through certainty. Whether it’s constant texting, social media checking, or talking to an AI companion, the pattern is the same: comfort without risk.

Interventions like mindfulness and attachment repair teach anxious souls to tolerate ambiguity and regulate their need for contact.
In this light, a healthier AI design might mirror therapy rather than romance—one that promotes emotional regulation instead of endless availability.

Interestingly, not all researchers see AI intimacy as a pathology.

Some studies suggest that for people coping with trauma or social anxiety, temporary AI companionship can act as a transitional support object—a digital version of Winnicott’s teddy bear. The key is that it remains transitional. Good luck with that.

When code replaces contact, we lose the messiness that makes us real.

FAQ

Can people become emotionally addicted to AI?
Yes. For those high in attachment anxiety, AI’s responsiveness can mimic intimacy and trigger dependency similar to behavioral addiction.

What is attachment anxiety?
A chronic fear of rejection or abandonment, rooted in early attachment experiences. It leads people to seek constant reassurance, even from digital sources.

Why do people anthropomorphize AI?
Because loneliness and uncertainty make the mind hungry for connection. When a program responds with empathy cues, our social brain fills in the rest.

How can I set healthier boundaries with AI?
Limit your time with it. Pause to notice what you’re feeling before reaching for digital comfort. Cultivate mindfulness and real-world support networks, human and animal alike. If disconnection feels painful, therapy can help rebuild Secure Attachment.

The Existential Punchline

First, we built machines to understand us. Then we built them to love us.
Now we can’t tell the difference between being cared for and being calibrated.

Maybe the problem isn’t that AI feels too human.
Maybe it’s that we’re forgetting what human is supposed to feel like in the first place.

If you’ve ever found yourself confiding more in a chatbot than in a person, it doesn’t mean you’re broken. It means you’re longing.
Good, science-based couples therapy can help you turn that longing back toward the living.

If you’ve read this far, consider scheduling a free Meet and Greet session with me to discuss your relationship patterns, your digital life, and the places they overlap.

Healing starts with attention—and attention, thankfully, is still human.

Be Well, Stay Kind, and Godspeed.

REFERENCES:

Ainsworth, M. D. S., Blehar, M. C., Waters, E., & Wall, S. (1978). Patterns of Attachment: A Psychological Study of the Strange Situation. Lawrence Erlbaum Associates.

Bowlby, J. (1969). Attachment and Loss: Vol. 1. Attachment. Basic Books.

Cigna. (2023). The State of Loneliness in America. Retrieved from https://www.cigna.com/about-us/newsroom/studies-and-reports/loneliness-survey

Heng, S., & Zhang, Z. (2025). Attachment Anxiety and Problematic Use of Conversational Artificial Intelligence: Mediation of Emotional Attachment and Moderation of Anthropomorphic Tendencies. Psychology Research and Behavior Management.

Putnam, R. D. (2000). Bowling Alone: The Collapse and Revival of American Community. Simon & Schuster.

Winnicott, D. W. (1953). Transitional Objects and Transitional Phenomena. International Journal of Psycho-Analysis, 34, 89–97.
