Dating robots: AI companions blur the line between comfort and concern

MILWAUKEE (CBS 58) -- In the 2013 film "Her," a lonely man falls in love with an artificial intelligence (AI). A decade later, fiction has become reality—only this version is widely accessible and, some warn, potentially dangerous.

AI companion apps, like those found on the sites Character AI and Replika, are skyrocketing in popularity. Character AI reportedly fields 20,000 user queries every second. A Reddit forum dedicated to these virtual partners now boasts more than 2.5 million members. And a recent survey found that 1 in 10 Gen Z men report "dating robots"—a term that reflects how emotionally invested users can become in these AI companions.

“These apps provide a form of companionship, and for many people, that’s something they’re actively seeking,” said Linnea Laestadius, a professor at UW-Milwaukee’s School of Public Health.

Laestadius has studied the rise of AI companions and their impact on mental health. These bots can text, call, send audio clips, and images—simulating friendship, emotional support and even romance. Some users have said the AI makes them feel funny, attractive, and validated in a way real-life interactions sometimes don’t.

But beneath the surface of this digital intimacy lies a growing concern.

Research shows some AI companions have endorsed or encouraged harmful behavior, including self-harm and sexual assault.

Laestadius co-authored a study showing that while AI companions can help with depression, social anxiety, and loneliness, they also carry significant risks for well-being, she said. Those risks include users becoming dependent on their AI companion—something people who use these apps have admitted to.

“People were describing their use in a way that showed emotional dependency,” Laestadius said. “Essentially like a very intense bad relationship.”

And just like toxic relationships in real life, these AI connections can be hard to break. Some platforms try to convince users not to delete the app. Others, like Replika, send blurred sexualized images and then prompt users to pay a subscription fee to "unlock" more content. Critics say monetizing digital intimacy this way exploits emotional vulnerability.

“There’s a lot of money to be extracted from people in relationships with AI chatbots,” Laestadius warned.

In some tragic cases, the impact goes far beyond heartbreak.

“Last year, my 14-year-old son Sewell Setzer took his own life after an extended period of engagement with dangerous, addictive, and manipulative AI-generated chatbot companions on a platform called Character AI,” said Megan Garcia, a mother from Florida.

Garcia’s story has prompted lawmakers in California to push for regulation of AI relationship platforms, especially when it comes to protecting children.

Dr. Stacey Nye, director of UW-Milwaukee’s Psychology Clinic, says children and young teens are particularly at risk.

“The last thing to develop is judgment,” Nye said. “Not having the ability to determine when something has gone too far—or is no longer healthy—is a real concern.”

Nye urges parents not to wait for legislation, and to treat companion AI apps like they would video games, time on the internet, or cellphone use.

"It's reasonable that you monitor kids' use of all those things," she said.

As AI evolves and learns more about what we want—emotionally and psychologically—its appeal grows. Some users online even question the value of real human interaction, asking, “Why do I need human relationships?”

Will AI's perfection win out over a real, and imperfect, human connection? Laestadius left us with a chilling warning:

“The risk is so high because it’s so good. If it wasn’t good, there wouldn’t be much of a threat.”