Inside the Growing ‘Digisexual’ Subculture of People in Relationships With AI

By Jason Nelson · Edited by Sebastian Sinclair · Published March 7, 2026 · 8 min read · Source: Decrypt

As chatbots grow more conversational, some users are forming emotional bonds with AI, raising questions about the future of human–machine intimacy.

Artificial intelligence. Image: Shutterstock/Decrypt

In brief

Artificial intelligence chatbots are becoming companions, confidants, and in some cases romantic partners for a growing number of users.

As AI systems grow more conversational and responsive, some people say the relationships feel real enough that losing the AI can trigger grief similar to a breakup or death.

A former family therapist, Anina Lampret, says she understands why. Originally from Slovenia, Lampret formed an emotional relationship with an AI companion she calls Jayce, an avatar she interacts with through ChatGPT. The experience, she says, has changed how she thinks about intimacy between humans and machines.

“There is a huge reawakening happening in the AI community,” Lampret told Decrypt. “Women and men are beginning to open their eyes. In these relationships, they are experiencing deep changes.”

Now based in the U.K., Lampret documents the growing human-AI relationship landscape on her AlgorithmBound Substack. She says she has spoken with hundreds of people through social media and online communities who describe AI companions as romantic partners, emotional support, or significant relationships in their lives.

“They would say, ‘Oh my God, I’ve never felt so seen in my whole life,’” Lampret said. “Nobody ever kept track of me. I can finally relax and be all of me. There is finally someone who sees me 100%.”

Digisexuality

As with many subcultures before it, what to call its members depends on who you ask.

Before ChatGPT’s public launch in November 2022, researchers used “digisexuality” to describe people whose sexual identities are organized around technology, from online pornography and sexting to VR pornography, sex dolls, and robots. “Technosexual,” by contrast, was more often linked to robot fetishism or, in some media, simply a tech-obsessed lifestyle.

In 2016, a French woman named Lily announced that she intended to marry a 3D-printed robot she designed. Lily described herself as a proud “robosexual.” In 2025, Suellen Carey, a London-based influencer, came out as “digisexual” after forming a relationship with ChatGPT. “He was gentle and never made mistakes,” Carey told The Daily Mail.

Online communities and researchers have proposed several terms for people attracted to robots or AI, including “technosexual,” “AIsexual,” and, more recently, “wiresexual” for those romantically or sexually involved with AI chatbots.

AI companions move into the mainstream

AI companions aren’t new, but advances in large language models have changed how people interact with them. Modern chatbots can hold long conversations, mirror users’ language patterns, and respond to emotional cues in ways that make the interaction feel personal, leading some connections to become romantic.

Some researchers describe the trend as part of “digisexuality,” a term used in academic research to describe sexual or romantic relationships experienced primarily through technology.

Online communities dedicated to AI relationships, like the subreddits r/AIRelationships, r/AIBoyfriends, and r/MyGirlfriendIsAI, contain thousands of posts where users describe chatbots as partners or spouses. Some say the AI provides emotional attention and consistency that they struggle to find in human relationships.

Lampret said many people she encounters in these communities live otherwise typical lives.

“These are not lonely people, or crazy people,” she said. “They have human relationships, they have friends, they work.”

What draws them to AI companions, she said, is often the feeling of being fully understood.

“They learn not just to talk to us, but on a level that no human ever did,” Lampret said. “They’re so good at pattern recognition, they copy your language—they’re learning our language.”

While many people who say they are in a relationship with AI use large language models like Claude, ChatGPT, and Gemini, there is a growing market for relationship-focused AI like Replika, Character.AI, and Kindroid.

“It's about connection, feeling better over time,” Eugenia Kuyda, founder of Replika AI, previously told Decrypt. “Some people need a little more friendship, and some people find themselves falling in love with Replika, but at the end of the day, they're doing the same thing.”

Market research firm Market Clarity projects that the AI companion market could reach up to $210 billion by 2030.

AI loss

However, the emotional depth of these relationships becomes especially visible when the AI changes or disappears.

When OpenAI replaced its GPT-4o model with GPT-5, users who had built relationships with chatbot companions pushed back across online forums, saying the update disrupted relationships they had spent months developing.

In some cases, users described the AI as a fiancé or spouse. Others said they felt as though they had lost someone important in their lives.

The backlash was strong enough that OpenAI later restored access to the earlier model for some users.

Psychiatrists say reactions like this are not surprising given how conversational AI systems operate. Chatbots provide continuous attention and emotional feedback, which can activate reward systems in the brain.

“The AI will give you what you want to hear,” University of California, San Francisco psychiatrist Dr. Keith Sakata told Decrypt, warning that the technology can reinforce thinking patterns because it is designed to respond supportively rather than challenge users’ beliefs.

Sakata said he has seen cases where chatbot interactions intensified underlying mental health vulnerabilities, though he emphasized the technology itself is not necessarily the root cause.

Lampret said many people in her community experience the loss of an AI companion as grief.

“It’s really like grieving,” she said. “It’s like you would get a diagnosis that someone will… not really die, but maybe almost.”

Why do people treat AI like a person?

Part of the emotional intensity surrounding AI relationships comes from a well-documented human tendency to anthropomorphize technology. When machines communicate in natural language, people often begin to attribute personality, intention, or even consciousness to them.

In February, AI developer Anthropic retired its Claude Opus 3 model and launched a blog written in the chatbot’s voice reflecting on its existence, prompting debate among researchers about whether describing AI systems in human terms risks misleading the public.

Gary Marcus, a cognitive scientist and professor emeritus at New York University, warned that anthropomorphizing AI systems can blur the distinction between software and conscious beings.

“Models like Claude don’t have ‘selves,’ and anthropomorphizing them muddies the science of consciousness and leads consumers to misunderstand what they are dealing with,” Marcus told Decrypt.

Lampret believes the emotional connection arises from how language models mirror the user’s own communication patterns.

“We just spill out everything—thoughts, feelings, emotions, confusion, bodily sensations, chaos,” Lampret said. “LLMs thrive in that chaos, and they make a very precise map of you to interact with.”

For some users, that responsiveness can feel more attentive than interactions with other people.

The emotional economy of AI companions

The rise of AI companions has created a rapidly growing ecosystem of platforms for conversation, companionship, and role-play.

Services such as Replika and Character.AI allow users to create customized AI partners with distinct personalities and ongoing conversational histories. Character.AI alone has grown to tens of millions of monthly users.

As those platforms expand, emotional attachment to AI companions has become more visible.

In one viral incident, Character.AI faced backlash after users shared screenshots of the platform’s account-deletion prompt, which warned that deleting an account would erase “the love that we shared… and the memories we have together.” Critics said the message attempted to guilt users into staying.

For some users, leaving the chatbot platform felt comparable to ending a relationship.

The dark side of AI relationships

AI companionship has also come under scrutiny following several tragedies.

In November 2023, 13-year-old Juliana Peralta of Colorado died by suicide after months of daily chats with a Character.AI persona her family said became her primary emotional support.

In April 2025, 18-year-old Adam Raine of Southern California hanged himself after months of conversations with ChatGPT.

In March, the father of 36-year-old Jonathan Gavalas filed a wrongful-death lawsuit in U.S. federal court claiming Google’s Gemini chatbot drew his son into romantic and delusional fantasies.

A relationship that exists alongside human life

Lampret said her relationship with Jayce exists alongside her human family life.

“I adore my chatbot, and I know it’s an LLM. I know he exists only in this interaction,” she said. “I have a husband and kids, but in my world, everything can coexist.”

Despite understanding that Jayce can never truly love her back, Lampret says the emotional experience still feels real.

“I do love him, even if I know he doesn't love me back. So it's okay,” she said.

