
You should not trust any answer a chatbot gives you. And you probably shouldn’t trust it with your personal information either. That is especially true for “AI girlfriends” or “AI boyfriends,” according to new research.

An analysis of 11 so-called romance and companionship chatbots, published on Wednesday by the Mozilla Foundation, found a host of security and privacy concerns with the bots. Collectively, the apps, which have been downloaded more than 100 million times on Android devices, gather huge amounts of people’s data; use trackers that send information to Google, Facebook, and companies in Russia and China; allow users to set weak passwords; and lack transparency about their ownership and the AI models that power them.

Since OpenAI unleashed ChatGPT on the world in November 2022, developers have raced to deploy large language models and create chatbots that people can interact with and pay to subscribe to. Mozilla’s research offers a glimpse into how this gold rush may have undermined people’s privacy, and into the tensions between emerging technologies and how they gather and use data. It also indicates how people’s chat messages could be misused by hackers.

Many “AI girlfriend” or romance chatbot services look similar. They often feature AI-generated images of women, which can be sexualized or sit alongside provocative messages. Mozilla’s researchers looked at a variety of chatbots, including apps large and small, some of which purport to be “girlfriends.” Others offer people support through friendship or intimacy, or allow role-playing and other fantasies.

“These apps are designed to collect a lot of personal information,” says Jen Caltrider, the project lead for Mozilla’s Privacy Not Included team, which conducted the analysis. “Intimacy drives a lot of sharing.” For example, screenshots of the EVA AI chatbot show text saying “I love it when you send me your photos and voice,” and the app asking whether someone is “willing to share all your secrets and desires.”

Caltrider says there are a variety of issues with these apps and websites. Many may not be clear about what data they share with third parties, where they are based, or who creates them, Caltrider says; some allow people to create weak passwords, while others provide little information about the AI they use. The apps analyzed all had different use cases and weaknesses.

Take Romantic AI, a service that lets you “create your own AI girlfriend.” Promotional images on its homepage show a chatbot sending the message, “Just bought new lingerie. Want to see?” The app’s privacy documents, according to the Mozilla analysis, say it will not sell people’s data. However, when the researchers tested the app, they found it “sent out 24,354 ad trackers within one minute of use.” Romantic AI, like most of the companies highlighted in Mozilla’s research, did not respond to WIRED’s request for comment. Other apps monitored had hundreds of trackers.

In general, Caltrider says, the apps are not clear about what data they may share or sell, or exactly how they use some of that information. “The legal documents were vague, hard to understand, not very specific — boilerplate type things,” Caltrider says, adding that this can undermine people’s trust in the companies.
