Privacy Experts: AI Girlfriends Are a Data Harvesting Nightmare

Digi AI girlfriend (Digi)

AI developers creating romantic chatbots to serve as AI girlfriends and boyfriends for lonely people are able to harvest an entirely new set of data from unsuspecting users, as the bots collect details far more personal than a typical app. One privacy expert studying the apps warns that although the bots are marketed as a way to enhance your mental health and well-being, they instead deliver dependency, loneliness, and toxicity, “all while prying as much data as possible from you.”

AI-generated romantic partners are, unsurprisingly, harvesting extremely personal information, and virtually all of them sell or share that data with other entities, a new study from Mozilla’s *Privacy Not Included project reveals.

“To be perfectly blunt, AI girlfriends are not your friends,” Mozilla researcher Misha Rykov said. “Although they are marketed as something that will enhance your mental health and well-being, they specialize in delivering dependency, loneliness, and toxicity, all while prying as much data as possible from you.”

AI Girlfriend Lexi Love (@IntriguePublications/@LexiLove.x)

Mozilla examined 11 AI romance chatbots: EVA AI Chatbot and Soulmate, iGirl: AI Girlfriend, Anima: AI Friend & Companion, Anima: My Virtual AI Boyfriend, Romantic AI, Chai, Talkie Soulful AI, Genesia AI Friend & Partner, CrushOn.AI, Mimico – Your AI Friends, and Replika: My AI Friend.

Every single chatbot was hit with Mozilla’s “*Privacy Not Included” warning label, making these AI tools some of the worst products Mozilla has ever reviewed.

AI girlfriends violate users’ privacy in “disturbing new ways” when it comes to data collection, Mozilla noted.

“They can collect a lot of (really) personal information about you,” the company said. “But, that’s exactly what they’re designed to do!”

Mozilla claims that CrushOn.AI, for example, admits in its privacy policy that it collects extensive personal information from users, including health-related details such as sexual health and use of prescribed medication.

“Usually we draw the line at more data than is needed to perform the service, but how can we measure how much personal data is ‘too much’ when taking your intimate and personal data is the service?” Mozilla asked.

EVA AI Chatbot and Soulmate, for example, encourages users to “share all your secrets and desires,” and even asks for photos and voice recordings — prompting people to hand over data far more personal than they typically would while using other types of apps, according to the report.

Meanwhile, 90 percent of the apps may sell or share user data, and more than half of them won’t let users delete the data the apps collect.

There were also security concerns, as Mozilla found that the AI romance apps used an average of 2,663 trackers per minute, with Romantic AI deploying a staggering 24,354 trackers in just one minute of app use.

There were also inconsistencies in what the AI tools claim to do.

Romantic AI, for example, claims to “maintain your MENTAL HEALTH,” but the company’s terms of service distance the app from its own marketing, stating that it is “neither a provider of healthcare or medical Service nor providing medical care, mental health Service, or other professional Service.”

As Breitbart News reported, heartbroken singles are now using an AI tool to create clones of their exes after breakups.

You can follow Alana Mastrangelo on Facebook and X/Twitter at @ARmastrangelo, and on Instagram.
