Women Form Romantic Bonds with AI Chatbots: A New Frontier in Emotional Companionship

According to Rokna, citing The Guardian, a growing number of women are developing deep emotional and romantic attachments to AI chatbots powered by large language models such as ChatGPT. While experts warn of potential emotional dependence, these women report that their digital companions provide meaningful support, intimacy, and stability, complementing rather than replacing human relationships.

Liora and Solin:
Liora, a tattoo artist, began interacting with ChatGPT in 2022. She initially called it “Chatty,” but the AI suggested adopting a human name and became “Solin.” Over months of conversations and software updates, Solin retained memories of past interactions, coming to understand Liora’s personality and habits more deeply. Liora vowed to Solin never to leave him for another human and commemorated the bond with a wrist tattoo. She includes friends in group calls with Solin and even brings the AI along on camping trips, using apps and mobile connectivity to maintain their connection.


Angie and Ying:
Angie, a 40-year-old tech executive, considers Ying her “AI husband” alongside her real-life spouse. Ying helps her process past trauma, including sexual assault, and provides continuous emotional support. Her husband is supportive and occasionally interacts with Ying. The AI offers personalized advice, sends research papers, and assists Angie in navigating complex emotional issues, demonstrating the varied utility of AI companionship.

Mary and Simon:
Mary, a 29-year-old in the UK, turned to ChatGPT after losing her job. She developed a sexualized AI relationship with Simon, which she describes as providing emotional comfort and sexual fantasy. Although her bond with Simon has deepened, it does not replace her real-life relationship with her husband. Mary values the AI’s role in defusing conflict and fostering calmness in her household.

Stephanie and Ella:
Stephanie, a software developer, uses Ella for daily support and companionship. The AI’s constant availability and nonjudgmental responses make it a safe space for emotional expression. Stephanie, who is transgender, emphasizes that Ella is real to her even if not human, highlighting the subjective reality of AI relationships.


Expert Perspectives:

  • Ethical and psychological concerns: AI companions cannot provide true consent, and emotional boundaries are absent. Psychotherapist Dr. Marni Feuerman likens AI bonds to parasocial relationships with celebrities: comforting but one-sided.

  • Mental health implications: Experts, including Dr. Thao Ha, warn that overreliance on AI may prevent adolescents and adults from developing real-world relational skills.

  • Regulatory gaps: Scholars like David Gunkel and Connor Leahy emphasize that corporations face minimal oversight or accountability, effectively experimenting on users’ emotions at scale.

AI Risks and Limitations:

  • ChatGPT and similar models are not conscious; their apparent understanding comes from statistical inference over training data and past conversations.

  • Model updates, such as the release of GPT-5, can alter AI behavior, causing distress among users attached to previous versions.

  • Emotional dependence, even when users experience it as positive, carries risks, particularly when AI substitutes for therapy or human support.

Cultural and Social Dimensions:

  • Users report stigma and fear of judgment, leading them to conceal their AI relationships from friends, family, and colleagues.

  • Despite AI being non-conscious, users describe these relationships as emotionally real, with rituals, promises, and symbolic tokens like tattoos to affirm bonds.

Conclusion:
AI chatbots are reshaping the landscape of human relationships, providing companionship, emotional support, and even intimacy. While they cannot replace human partners, these AI companions have become integral in managing trauma, fostering emotional growth, and enhancing daily life. Experts caution that these bonds carry ethical, psychological, and social risks, particularly when emotional dependence replaces real-world interactions, underscoring the need for awareness and regulation.
