– Incomprehensible

We associate Minecraft, My Little Pony, Fortnite and Among Us with play and innocent fun. But a quick search for some of these names reveals hundreds of accounts with sexualized content: visible genitals, sex positions and naked female bodies. All you need to search for is the name of the game. And these are games played by children as young as five.

The accounts appear when you search for a game name. Photo: Screenshot Instagram

New filter

According to the Norwegian Media Authority, Instagram is one of the most used social media platforms among children and young people. Around 65 per cent of those aged 9–18 use the platform.

Instagram has a filter that is supposed to weed out accounts containing violence, nudity or other offensive content, in line with Instagram's own guidelines. This week, Instagram launched an improved version of this filter. But even with the filter turned on, user accounts and photos like the examples above were not weeded out.

Two animated women from the game Fortnite. Photo: @fortnitesexychicks / INSTAGRAM

The filter only made such accounts slightly fewer, writes the organization Barnevakten, which tested it on the day it was launched, using an account set to age 13.

– Our experience is that it removed some of the game porn accounts, but not all. There were still many accounts with sexualized content, says Kris Munthe, a consultant on games and apps at Barnevakten.

He finds it unfortunate that better technical solutions have not been found.

– Other platforms manage it, so Instagram, which has all the resources it could possibly need, should be able to find a better solution.

Kris Munthe finds it incomprehensible that it should be so difficult for Instagram to remove the accounts. Photo: Private

Munthe says he has found accounts created as far back as 2014 that are still open on the platform.

– It is completely incomprehensible that this is not caught. We can wonder why Instagram does not do it, but something is clearly not working here.

Instagram's guidelines are clear. Yet sexualized content abounds. Photo: Synnøve Sundby Fallmyr / news

Plenty of pornographic content in games

news has previously written that game porn on social media is a growing problem, and that it can distort young people's relationship to sex and women later in life. Barnevakten now also hears from parents that children come across crude animations without having searched for them.

– It is very accessible and very visible without you having to search for it. The kids search for their favorite games, and all these accounts come up. The pictures are crude; they show sex, bare breasts and genitals, says Munthe.

– We react strongly to this. Children must be safe online, and we need a good framework that protects them. That should be a minimum requirement, says Munthe, who believes the authorities should take firmer action against the big-tech players.

Teach children about social media

The Norwegian Media Authority also reacts.

– Children and young people can be exposed to unpleasant content, such as pornography. The Norwegian Media Authority advises parents to have an open and good dialogue with their children about what they experience digitally. That way, children will dare to speak up if they experience something unpleasant, says Pernille Huseby, director of communication and advice at the Norwegian Media Authority.
She adds that parents should talk to their children about what it means to be on social media, what challenges may arise, and how best to handle them, for example by blocking and reporting other users.

– It is also important to familiarize yourself with the various options for parental controls and the corresponding user settings, says Pernille Huseby.

Not the parents' responsibility

Barnevakten agrees that one should learn to handle social media, but believes the authorities should do more.

– We see that many are concerned with teaching parents and children what can appear online and how to deal with it. We fully agree with this, but we believe there is a limit to how much responsibility children and parents should bear when it comes to these platforms.

This is being done:

The Norwegian Media Authority has, on behalf of the government, drawn up a new national strategy for a safe digital upbringing, and is now developing an action plan with concrete measures. The goal is to better coordinate the state's efforts to keep children and young people safe online.

The protection of children on social media is a highly topical issue, and several measures are being taken at EU level to address, among other things, illegal and harmful content and illegal marketing on social media:

From September 2020, EU countries were obliged to introduce rules to protect minors from harmful content on video-sharing platforms such as YouTube, TikTok, Facebook and Snapchat. Norway is now in the process of writing a similar rule into the Broadcasting Act.

Since the major social media platforms such as TikTok and Snapchat are not Norwegian services, the Norwegian Media Authority will not have the authority to ensure that these services follow the Norwegian rule.

The EU is working on a set of rules that will give users better control over how global platforms use personal data to target content, and make it easier to remove content that is illegal under national law.

Source: The Norwegian Media Authority

– I think it is the platforms that are failing, and they must take responsibility. And to get them to do that, the authorities must take action and make clear demands, says Munthe.

He points out that the authorities have required the labeling of sponsored and retouched content, but have done little about nudity and the body.

– I think that if the authorities make demands, Instagram will find solutions. But there is no pressure on them, so this will take a long time.

Children see sexual content

According to figures from the Norwegian Media Authority, 49 per cent of children and young people between the ages of 13 and 18 have seen pornographic images and videos online, a share that has risen in recent years. 60 per cent of those who have seen videos and pictures with sexual content did so before the age of 13, while 40 per cent have received advertising for such content.

At the same time, half of nine-year-olds use one or more social media, and from the age of 13 almost everyone is on social media. The figures also show large gender differences: 75 per cent of boys have seen pornographic images and videos, compared with 25 per cent of girls.

The Government: – Cooperating with the EU

State Secretary Gry Haugsbakken (Labour Party) in the Ministry of Culture and Gender Equality says that Norway is cooperating with the EU on proposals to provide greater control of and supervision over internet platforms.
Gry Haugsbakken says there is political agreement in the EU that the authorities should make clearer demands on the actors to have this type of content removed quickly.

– This is one of the issues raised there, and it is expected to take effect before the autumn, she says.

Photo: Ilja C. Hendel / Ministry of Culture

The goal is to reduce the risk of manipulation and disinformation, and to counteract illegal content. Among other things, large internet platforms with more than 45 million users in the EU will be required to annually identify, analyze and limit the risks associated with the use of the platform, such as the spread of illegal content.

– In addition, we are now starting work on a separate report to the Storting on safe digital upbringing, where harmful content, for example pornography, is one of the things being looked at. This is also a topic for the Freedom of Expression Commission, which has been asked to look at measures to prevent the spread of illegal and harmful content on electronic platforms and social networks.

Removed three accounts

Meta, which owns Facebook and Instagram, writes in an email to news that the company has removed the three accounts that news made it aware of. According to communications manager Regitze Reeh, the three accounts violate Meta's rules on animated nudity.

– We generally have clear rules against nudity, to protect our users from involuntary sharing. We do not allow content that depicts sexual activity, and it will be removed as soon as we become aware of it.

Reeh states that of the content that violates Instagram's rules against nudity, the systems remove 94 per cent before users report it.

– That said, no enforcement is perfect, and we encourage people to report content they believe violates our rules, says Regitze Reeh.


