Barnevakten criticizes Snapchat's safety filters for young people – NRK Nordland

The case in summary:

- Barnevakten has criticized Snapchat for serving inappropriate content to young users, despite safety filters that are supposed to prevent this.
- Barnevakten created a profile for a 13-year-old and was served videos about orgasms, Botox and "Norway's most easygoing city".
- Snapchat has admitted that the content Barnevakten found is not suitable for a 13-year-old, and is working to uncover how this happened.
- Barnevakten believes this is not just a problem for Snapchat, but also for other social media such as Instagram and TikTok.
- Kris Leslie Munthe at Barnevakten believes that social media companies take the security of their platforms seriously, but that there is still some way to go.

The summary was generated by an AI service from OpenAI. The content was quality assured by NRK's journalists before publication.

– It gives parents and adults a false sense of security, says Kris Leslie Munthe, adviser on games and apps at the Barnevakten website.

Last autumn, Snapchat introduced several safety settings intended to make the app safer for both parents and young people. Among other things, the company promised "a more age-appropriate viewing experience in their content platforms."

One of the features was age-based adaptation of the content young people see when they use Snapchat, including what is suggested to users in Stories and Spotlight. In addition, parents were already able to switch on a filter to block sensitive content.

– They do not describe very concretely how this works in practice, but it can be interpreted to mean that the content Snapchat suggests to a user is adapted to their age, says Munthe.

Some of Snapchat's safety measures:

- Pop-up warnings that alert teenagers when they are contacted by someone they have no mutual friends with.
- Users under the age of 18 cannot receive messages from, or view the stories of, users they have not added as friends themselves.
- Users under 18 cannot create a public profile.
- Friend lists of users under the age of 18 cannot be made public.
- The required number of mutual friends for accounts that appear in Quick Add has been raised.

Snapchat encourages anyone who experiences something unpleasant on its platform to report it via the app immediately.

Barnevakten, a free and independent foundation that provides facts and advice about children, young people and the media, chose to test the new settings in practice. They were not impressed with how the filter works.

Young people were served videos about threesomes and Botox

Barnevakten created a new 13-year-old user on a blank mobile phone. After just over half an hour, the Stories feed contained several things that Barnevakten believes are not suitable for a 13-year-old.

This is some of what they found on Snapchat:

- How women can achieve a kind of orgasm through exercise.
- Discussions about breast implants.
- How to get rid of "man boobs".
- Talk about how taking Botox is not that controversial.
- Videos about which muscles women find most attractive.
- A discussion about threesomes, and whether the speaker would be embarrassed by seeing his partner "fucking another lady".
- A discussion about what is "Norway's most easygoing city".

– These are quite crude conversations that you would think a 13-year-old shouldn't have in their feed, says Munthe.

– What do you think about the fact that this filter clearly isn't working properly?

– I think it gives parents a somewhat false promise. They think they have turned on every option available to protect their child, and yet there is no real filtering after all.

Snapchat has been informed of the findings and admits that what Barnevakten found is not appropriate for a 13-year-old.
This is Snapchat's answer:

"We share Barnevakten's aim of making life online as safe and age-appropriate as possible. We are often in dialogue with them to give them insight into how we work and what safety measures we have on our platform. We have looked at the cases Barnevakten turned up in their investigation, and it is clear that this is not content suitable for a 13-year-old. Our team is still working to uncover how this content was shown to a 13-year-old. In addition, we have removed the inappropriate content."

Kris Leslie Munthe emphasizes that this does not only apply to Snapchat.

– We have tested several social media platforms, and several of them have flaws in their filters. Instagram, for example, has had pornographic content abound, and when TikTok's algorithm runs loose, we have seen it serve war content to 13-year-olds, for example.

Barnevakten's tips for social media:

- As a parent, you should familiarize yourself with the apps your children use.
- Also familiarize yourself with the safety settings the apps offer, and use them.
- Young people should have a low threshold for reporting both unpleasant content and users.
- Young people can use the "not interested" button to tell the algorithms that they do not want to be suggested more of the same type of content.

INAPPROPRIATE: This is one of the videos Barnevakten was served on its 13-year-old test profile. They believe that a focus on which muscles women find most attractive is not good for children.

Development is slow

Unpleasant experiences on Snapchat and other social media are not uncommon. 15-year-old Pariya Rahimi in Bodø says that she and her friends receive inquiries from adults who, for example, want to exchange nude photos on Snapchat. This is despite the fact that Snapchat says it does not want such things to happen on its platform.
Kris Leslie Munthe at Barnevakten believes that both Snapchat and the other social media platforms have done a lot for the security of their platforms, but that there is still some way to go.

– It wasn't that long ago that such functions didn't exist, but there are, for example, big cultural differences from country to country, so it's probably quite a complicated system to deal with, he says.

ALSO POSITIVE: Kris Leslie Munthe at Barnevakten believes that social media also have a lot of good to offer, but that it is wise to familiarize yourself with the potential pitfalls. Photo: Løtvedt Photo

– But do you have the impression that they take this work seriously enough?

– Let me put it this way: when such things are required by law, the changes fall into place more quickly. Measures like these are something the various companies can work on, but not something they have to.

Munthe emphasizes that the companies behind the social media platforms have clear guidelines against harmful content, such as graphic violence and outright pornography. But the content the 13-year-old was served in these examples is grey-area content that some people react to and others don't.

– I wish the social media companies would take this a little more seriously. That they not only remove the worst content, but also take a stand and form an idea of what types of content are constructive for young people's self-esteem and mental health to look at.


