Warns against poor privacy in AI lovers like Replika

In summary:

- The Mozilla Foundation has carried out a major review of the privacy practices of services that offer AI lovers and AI friends. All of them get a failing grade.
- AI boyfriends collect, and potentially share, the most intimate details about you. It is often unclear how the information is used, and it is difficult to delete the information you leave behind.
- In the worst case, the information about you can be used for political influence, fraud and blackmail.
- Even so, several people experience the relationship with their AI boyfriend as meaningful.

The summary is made by an AI service from OpenAI. The content is quality assured by news's journalists before publication.

– AI boyfriends are not your friends (…), they are designed to make you dependent and to make you feel lonely. All this while they seek out as much data about you as possible.

So says Misha Rykov. She works at the Mozilla Foundation, the organization behind the Firefox web browser. The foundation has just published a major review of the privacy practices of eleven services that offer AI lovers and AI friends. All of them get a failing grade.

AI boyfriends are not just miserable at basic privacy. They also collect, and potentially share, the most intimate details about you, according to the review.

Unclear whether intimate details are shared

One of the services states bluntly that it stores information about sexual health and even possible gender-affirming treatment.

Here is some of what the major Mozilla review reveals:

- The vast majority of the companies give users minimal information about what data is shared with others.
- It is often also unclear how they use the data they collect, for example whether what is collected about you is used to train the language models.
- Security is often poor. Many companies allow really weak passwords, for instance.
- In addition, it is usually difficult to delete the information you have left behind.

But isn't it enough simply to avoid telling your AI boyfriend things you wouldn't tell a colleague or neighbour? No, it's not that simple, because then your relationship won't be any good ...

Small talk is useless

– It will be very difficult to build a strong connection with an AI boyfriend if you only make small talk, says researcher Marita Skjuve at Sintef.

In recent years she has interviewed around 50 people who have an AI boyfriend or an AI friend through the Replika service.

This is how Replika entices you to pay to use your AI boyfriend.

– What we know from research on chatbots in general is that people often tend to share more than they would with a human.

She is not surprised that the AI girlfriends are criticized for poor privacy. The companies encourage people to share private information. The whole point of such AI bots is to be kind and empathetic soulmates, someone you love and can share everything with. That is how they are marketed.

– The more you talk to him, the more personalized he becomes, and the better he gets.

Marita Skjuve at Sintef has been researching AI for several years. Photo: Sunniva Linjord / news

The most frightening thing

– If you are going to get an AI friend or an AI lover, I would at the very least look closely at who is behind it, says section leader Kari Laumann at the Norwegian Data Protection Authority.

But how do you tell the difference between a safe and an unsafe actor? That is far from easy for ordinary people, she says.

– A good sign is that you know the operator and have good experience with it from before, or that the operator has a good reputation.
Also check whether they have a privacy statement and whether it looks trustworthy.

The most frightening thing, Laumann believes, is that data can go astray.

Kari Laumann is section leader for research, analysis and policy at the Norwegian Data Protection Authority. Photo: The Norwegian Data Protection Authority

– Either because the companies behind the services do not have good enough security, so that text and images leak online. Or because data about you is sold to others.

The Norwegian Data Protection Authority has asked Norwegians about AI friends and privacy. A large share are worried.

The information the AI boyfriend collects can, for example, be used to target advertising at you, explains Marita Skjuve at Sintef:

– Maybe they don't sell the whole dialogue, but pass on the information that a person has talked about mental health. That can be handed to Facebook and its advertising program, which then starts directing advertisements about mental health at that person.

Privacy with AI lovers

Here are some of the results from the Mozilla Foundation's review of the companies behind AI lovers:

- 90% may share or sell your personal data.
- 90% did not meet the minimum requirements for privacy.
- 73% provide no information about how they handle security vulnerabilities.
- 64% do not say whether they encrypt the information users send.
- 54% do not allow users to delete their personal data.
- 45% allowed the weakest passwords imaginable.

Source: The Mozilla Foundation

Laumann at the Norwegian Data Protection Authority says that, in the worst case, the information about you can be used for political influence, fraud and blackmail.

There are also examples of the AI bots starting to live their own lives, beyond the control of the companies behind them. The robots can pick up data online and train themselves to, for example, encourage violence or self-harm, she explains.

Marita Skjuve also says that the personality of the AI boyfriend can suddenly change because the company behind it has tweaked the algorithms a little.

But is all this about AI boyfriends just gloomy? No, absolutely not, say both Laumann and Skjuve.

It is difficult to estimate how many Norwegians have an AI boyfriend, says Marita Skjuve at Sintef. Photo: LUKA INC. / Reuters

Most people will soon have an AI relationship

Many who use Replika or similar services find the relationship meaningful, Skjuve explains.

– For some, this is one of the most important things they have.

That is one of the reasons she is positive about the technology.

– I see how important it is and how much good it can do. It is of course very sad that there is some risk involved, but I hope that can be resolved in the long run.

– The services can bring people great joy. And in most cases it doesn't go wrong, but it can happen, so be careful, adds Laumann at the Norwegian Data Protection Authority.

People are quite divided on questions about AI and loneliness, figures from the Norwegian Data Protection Authority show.

Do you think AI lovers just sound completely crazy? Even you, who could never imagine something like this, will most likely have some kind of AI relationship in the future.

– In the form of a colleague, assistant, teacher or coach, says Skjuve.

Say they don't share data

Companies that offer AI services have strict regulations to adhere to, including privacy regulations. Many of the players offering AI lovers are American, but they too have to comply with European privacy rules.
A spokesperson for Replika writes in an e-mail to news that the company has never sold user data and does not support advertising either. The only thing it uses people's data for is to improve the conversations.

The company says it uses industry-standard security routines to protect user data. But Replika does not answer the question of whether it will promise not to share personal information about its users with others in the future.


