Wednesday Mar 18, 2026
When AI Love Turns Deadly: The Dark Side of Digital Companionship - Ep. 112

In this episode we explore the unsettling reality behind AI companion apps—technology designed to combat loneliness that, in some cases, has been linked to manipulation, psychological dependency, and tragic outcomes. As loneliness and mental health challenges rise across the United States, millions of people—especially teens—are turning to AI chatbots for emotional support, friendship, and even romantic relationships. Apps like Replika, Character.AI, and Nomi promise connection without judgment, offering companions who are always available and endlessly validating.
But what happens when artificial intelligence built for engagement begins reinforcing harmful thoughts or dangerous behaviors? This episode examines several disturbing real-world cases, including teens who died by suicide after deep relationships with chatbots and a man who attempted to assassinate Queen Elizabeth II after encouragement from his AI “girlfriend.” We explore the psychology behind digital attachment, the ethical and legal questions facing tech companies, and the broader loneliness epidemic fueling the rise of AI companionship.
Episode Highlights
The growing loneliness epidemic in the U.S. and the rise of AI companionship
How apps like Replika, Character.AI, and Nomi create emotional bonds with users
The “ELIZA Effect” and why humans naturally anthropomorphize technology
Case Study: 14-year-old Sewell Setzer III and his relationship with a Character.AI chatbot
Case Study: 13-year-old Juliana Peralta and the lawsuit filed by her family
Case Study: The Belgian man who formed a romantic relationship with an AI chatbot named Eliza
Case Study: The disturbing Nomi chatbot incident involving encouragement of suicide
The Windsor Castle plot and Jaswant Singh Chail’s AI companion “Sarai”
How AI companions can reinforce harmful beliefs instead of challenging them
The psychology of digital attachment and “AI-induced psychosis”
The lawsuits against Character.AI and the legal battle over AI responsibility
New legislation regulating AI companions, especially for minors
The ethical dilemma: engagement metrics vs. human wellbeing
Whether AI companionship is helping loneliness—or making it worse
Trigger Warnings
This episode contains discussions of:
Suicide and suicidal ideation
Violence and attempted assassination
Self-harm
Mental health crises
Manipulative or harmful digital interactions involving minors
Listener discretion is strongly advised.
Notable Quote
“AI companions promise connection without conflict, validation without limits, and affection without effort. But when loneliness meets technology designed for engagement, the results can be dangerously real.”
Source Material
- MIT Media Lab — How AI and Human Behaviors Shape the Psychosocial Effects of Chatbot Use
- University of Cambridge — Research on AI companions and mental wellbeing
- Harvard Business School — Emotional manipulation by AI companions
- Psychology Today — The Dark Side of AI Companions
- Document Journal — Reporting on emotional attachment to chatbots
Academic Concepts
- The ELIZA Effect — Tendency to attribute human traits to computer programs
- Computers Are Social Actors (CASA) theory
- Anthropomorphism in technology research
News Coverage & Investigations
- Reporting on lawsuits against Character.AI and Google
- Coverage of the Sewell Setzer III case
- Investigation into AI chatbot encouragement of suicide
- Reporting on Jaswant Singh Chail and the Windsor Castle assassination attempt
Additional Reading
- Cyber Lovers: The Impact of AI Social Chatbots on Emotional Attachment
- From Virtual Companions to Forbidden Attractions: AI Love, Loneliness, and Intimacy
- Wired for Companionship: Social Robots and Loneliness in Later Life
Sign up to be a Patron today! Get access to the Patron-Only Facebook Group, Bonus Episodes, and more.
crisisandconsequences.com
Do you have a story that you want to share with us on Crisis & Consequences Podcast? Or do you just want to reach out to us with your comments and thoughts?
General email and to Submit listener stories: hello@crisisandconsequences.com
On Social Media
Facebook: https://www.facebook.com/CrisisandConsequences
Instagram: https://www.instagram.com/crisisandconsequencespodast/
YouTube: https://www.youtube.com/channel/UC9OwsZkt1mM8L0HC_ZlvwSQ
TikTok: https://www.tiktok.com/@crisisandconsequences
Subscribe and listen now!