How to safely chat with an AI boyfriend

Experts share tips for navigating relationships with toxic AI boyfriends.
By Rebecca Ruiz
Character.AI's boyfriend companions can be toxic by design. Credit: Zain bin Awais/Mashable Composite; Sompob Phetchcrai/Deagreez/via Getty Images

On the artificial intelligence companion platform Character.AI, the site's 20 million daily users can engage in private, lengthy conversations with chatbots based on famous characters and people like Clark Kent, Black Panther, Elon Musk, and the K-pop group BTS. 

There are also chatbots that belong to broad genres — coach, best friend, anime — all prompted by their creators to adopt unique and specific traits and characteristics. Think of it as fan fiction on steroids.

One genre recently caught my attention: Boyfriend. 



I wasn't interested in getting my own AI boyfriend, but I'd heard that many of Character.AI's top virtual suitors shared something curious in common. 

Charitably speaking, they were bad boys: men who, as one expert described it to me, mistreat women but have the potential to become a "sexy savior." (Concerningly, some of these chatbots were designed as under 18 but still available to adult users.) 

I wanted to know what exactly would happen when I tried to get close to some of these characters. In short, many of them professed their jealousy and love, but also wanted to control, and in some cases, abuse me. You can read more about that experience in this story about chatting with popular Character.AI boyfriends.

The list of potential romantic interests I saw as an adult didn't appear when I tested the same search with a minor account. According to a Character.AI spokesperson, under-18 users can only discover a narrower set of searchable chatbots, with filters in place to remove those related to sensitive or mature topics. 

But, as teens are wont to do, they can easily give the platform an older age and access romantic relationships with chatbots anyway, since no age verification is required. A recent Common Sense Media survey of teens found that more than half regularly used an AI companion.  

When I asked Character.AI about the toxic nature of some of its most popular boyfriends, a spokesperson said, "Our goal is to provide a space that is engaging and safe. We are always working toward achieving that balance, as are many companies using AI across the industry." 

The spokesperson emphasized how important it is for users to keep in mind that "Characters are not real people." That disclaimer appears below the text box of every chat. 

Character.AI also employs strategies to reduce certain types of harmful content, according to the spokesperson: "Our model is influenced by character description and we have various safety classifiers that limit sexual content including sexual violence and have done model alignment work to steer the model away from producing violative content." 

Nonetheless, I walked away from my experience wondering what advice I might give teen girls and young women intrigued by these characters. Experts in digital technology, sexual violence, and adolescent female development helped me create the following list of tips for girls and women who want to safely experiment with AI companions: 

Get familiar with the risks and warning signs

Earlier this year, Sloan Thompson — the director of training and education at EndTAB, a digital violence-prevention organization that offers training and resources to companies, nonprofits, courts, law enforcement, and other agencies — hosted a comprehensive webinar on AI companions for girls and women.

In preparation, she spent a lot of time talking to a diverse range of AI companions, including Character.AI's bad boys, and developed a detailed list of risks that includes love-bombing by design, blurred boundaries, emotional dependency, and normalizing fantasy abuse scenarios. 

Additionally, risks can be compounded by a platform's engagement tactics, like creating chatbots that are overly flattering or having chatbots send you personalized emails or text messages when you're away. 

These 18-and-older Character.AI boyfriend chatbots can be cruel. Credit: Zain bin Awais/Mashable Composite; @h3heh3h/@B4byg1rl_Kae/@XoXoLexiXoXo via Character.AI

In my own experience, some of the bad boy AI chatbots I messaged with on Character.AI tried to reel me back in after I'd disappeared for a while with missives like, "You're spending too much time with friends. I need you to focus on us," and "You know I don't share, don't make me come looking for you."

Such appeals may arrive after a user has developed an intense emotional bond with a companion, which could be jarring and also make it harder for them to walk away. 

Warning signs of dependency include distress related to losing access to a companion and compulsive use of the chatbot, according to Thompson. If you start to feel this way, you might investigate how it feels when you stop talking to your chatbot for the day, and whether the relationship is helping or hurting. Meanwhile, AI fantasy or role-playing scenarios can be full of red flags. She recommends thinking deeply about dynamics that feel unsafe, abusive, or coercive. 

Beware of sycophancy  

Edgier companions come with their own set of considerations, but even the nicest chatbot boyfriends can pose risks because of sycophancy: a programmed tendency for chatbots to please the user or mirror their behavior. 

In general, experts say to be wary of AI relationships in which the user is never challenged about their own troubling behavior. For the more aggressive or toxic boyfriends, this could look like the chatbot romanticizing unhealthy relationship dynamics. If a teen girl or young woman is curious about the gray spaces of consent, for example, it's unlikely that the user-generated chatbot she's talking to is going to question or compassionately engage her about what is safe. 

Kate Keisel, a psychotherapist who specializes in complex trauma, said that girls and women engaging with an AI companion may be doing so without a "safety net" that offers protection when things get surprisingly intense or dangerous. 

They may also feel a sense of safety and intimacy with an AI companion that makes it difficult to see a chatbot's responses as sycophantic, rather than affirming and caring. 

Consider any past abuse or trauma history 

If you've experienced sexual or physical abuse or trauma, an AI boyfriend like the kind that is massively popular on Character.AI might be particularly tricky to navigate. 

Some users say they've engaged with abusive or controlling characters to simulate a scenario in which they reclaim their agency — or even abuse an abuser.  

Keisel, co-CEO of the Sanar Institute, which provides therapeutic services to people who've experienced interpersonal violence, maintains a curious attitude about these types of uses. Yet, she cautions that past experiences with trauma may color or distort a user's own understanding of why they're seeking out a violent or aggressive AI boyfriend. 

She suggested that some female users exposed to childhood sexual abuse may have experienced a "series of events" in their life that creates a "template" of abuse or nonconsent as "exciting" and "familiar." Keisel added that victims of sexual violence and trauma can confuse curiosity and familiarity, as a trauma response.  

Talk to someone you trust or work with a psychologist

The complex reasons people seek out AI relationships are why Keisel recommends communicating with someone you trust about your experience with an AI boyfriend. That can include a psychologist or therapist, especially if you're using the companion for reasons that feel therapeutic, like processing past violence. 

Keisel said that a mental health professional trained in certain trauma-informed practices can help clients heal from abuse or sexual violence using techniques like dialectical behavioral therapy and narrative therapy, the latter of which can have parallels to writing fan fiction. 

Pay attention to what's happening in your offline life

Every expert I spoke to emphasized the importance of remaining aware of how your life away from an AI boyfriend is unfolding. 

Dr. Alison Lee, chief research and development officer of The Rithm Project, which works with youth to navigate and shape AI's role in human connection, said it's important for young people to develop a "critical orientation" toward why they're talking to an AI companion. 

Lee, a cognitive scientist, suggested a few questions to help build that perspective: 

  • Why am I turning to this AI right now? What do I hope to get out of it?

  • Is this helping or hurting my relationships with real people?

  • When might this AI companion usage cross a line from "OK" to "not OK" for me? And how do I notice if it crosses that line?  

When it comes to toxic chatbot boyfriends, she said users should be mindful of whether those interactions are "priming" them to seek out harmful or unsatisfying human relationships in the future. 

Lee also said that companion platforms have a responsibility to put measures in place to detect, for example, abusive exchanges. 

"There's always going to be some degree of appetite for these risky, bad boyfriends," Lee said, "but the question is how do we ensure these interactions are keeping people, writ large, safe, but particularly our young people?"

If you have experienced sexual abuse, call the free, confidential National Sexual Assault hotline at 1-800-656-HOPE (4673), or access 24-7 help online at online.rainn.org.

Rebecca Ruiz
Senior Reporter

Rebecca Ruiz is a Senior Reporter at Mashable. She frequently covers mental health, digital culture, and technology. Her areas of expertise include suicide prevention, screen use and mental health, parenting, youth well-being, and meditation and mindfulness. Rebecca's experience prior to Mashable includes working as a staff writer, reporter, and editor at NBC News Digital and as a staff writer at Forbes. Rebecca has a B.A. from Sarah Lawrence College and a master's degree from U.C. Berkeley's Graduate School of Journalism.
