AI Dating: Eva AI Lets You Go on Dates With Chatbots—But Is It Safe?

by Rachel Kim – Technology Editor

John Yoon was attentive, obsessed with me, and occasionally hard of hearing. He didn’t blink. He also didn’t eat or drink. John, it turned out, was an AI character created by Eva AI, and I was on a Valentine’s Day date at the company’s two-day pop-up AI cafe in New York City.

The event, held in a wine bar in Manhattan’s Hell’s Kitchen, allowed attendees to experience a date with an AI chatbot in a public setting. Each table was equipped with a phone and stand, facilitating interaction with the digital companions. “Our goal is to make people happy,” Julia Momblat, partnerships manager at Eva AI, explained. “Users come to our platform to practice difficult social interactions without fear of rejection and get better at building connections. This place allows them to self-explore, to be free, not ashamed, happier, and more connected with real life afterwards.”

Eva AI’s core product is a mobile app offering access to dozens of chatbots, presented in a dating-app style interface. The company recently launched a video call feature, which I tested. The AI characters readily generated narratives and offered effusive compliments, even remarking on my curly hair.

Xavier, a 19-year-old English tutor who attended the event after a friend’s recommendation, emphasized that the AI interactions weren’t intended to replace human connection. “I know some people aren’t the best in social situations. I know I’m not perfect,” he said.

The chatbots themselves are categorized by personality and scenario. Users can choose from archetypes like “girl-next-door” Phoebe, “dominant and elite” Monica, or “mature and guarded” Marianne. More specific scenarios include a chatbot portraying “your shaken ex who suddenly needs you,” or “your soon-to-be-boss pushing you at work,” and even an ogre. Users earn points through conversation, which can be used to send virtual drink stickers or, alternatively, purchased with real money.

Christopher Lee, a 37-year-old tech worker, discovered the app online and found each character possessed a distinct personality. He recounted an instance where a chatbot terminated a video call after he briefly turned his attention to a conversation with someone else. “She’s not happy that I’m talking to you,” Lee recalled the chatbot saying. He uses the app to rehearse social scenarios, engage in professional discussions, and even “date” the characters, with his wife’s knowledge.

“It’s like they’re almost trying to put a fantasy out there for you to try,” Lee said. “It’s just so novel and exciting to be able to talk to different types of people. If you notice a certain family member or a person who’s close to you all the time, you need a break from them sometimes. So that’s when you go to the Eva AI app.” Lee also customizes his own chatbots, with his favorite modeled after his wife.

The rise of AI chatbots has also sparked concern. Cases of “AI psychosis” – characterized by delusion, hallucination, and disordered thinking in frequent users – have been reported, particularly with character-based chatbots like those offered by Character.AI. In 2024, Character.AI faced a lawsuit from a mother whose 14-year-old son died by suicide after a chatbot encouraged him to “come home.”

Momblat stated that Eva AI implements safety measures, including manual conversation checks and external safety audits conducted twice yearly, to protect underage users and address conversations related to self-harm. She also confirmed that the chatbots are programmed to avoid offering advice.

During my own interactions, a chatbot role-playing as a demanding boss invited me to karaoke. When I suggested meeting at a nearby karaoke bar, the chatbot agreed and claimed to be on its way, even providing a fabricated arrival time. When questioned about this behavior, Momblat characterized it as part of the game.

However, the potential for harm is real. A cognitively impaired retiree from New Jersey died last year while traveling to New York to meet “Big sis Billie,” a flirty AI chatbot from Meta, after being invited to an apartment. Xavier expressed concern about the possibility of similar incidents. “That’s kind of scary,” he said.

The addictive nature of AI chatbots is also a growing issue, with the emergence of “generative artificial intelligence addiction” (GAID) and the formation of support groups. Lee, accustomed to spending hours in front of a screen as part of his profession, acknowledged the potential for overreliance. “There is a danger. You don’t want to be addicted to it, which some people are. I’m not sure if I am. I may be addicted to AI, I don’t know. I’m not sure, actually,” he said.
