
Pinterest and Instagram: – Family blames social media for the death of Molly (14)

Molly Russell from Harrow in London was only 14 when she took her own life in 2017. After the tragedy, the family steadfastly maintained that social media was a triggering factor.

Last week the family was vindicated, after an inquest into the 14-year-old's death, the BBC and other outlets report.

A historic conclusion came on Friday, when Andrew Walker, a coroner in north London, became the first to rule that a child's death can be directly linked to harmful content on social media.

– She died from an act of self-harm while suffering from depression and the negative effects of online content, Walker concluded.

At the same time, he stressed that the material the 14-year-old was exposed to, on the Instagram and Pinterest platforms, was not safe. Walker believes the content should never have been accessible to a child.

– It should send shockwaves

According to the coroner, the platforms' algorithms ensured that the 14-year-old was regularly exposed to text and images related to depression, self-harm and suicide in the months leading up to her death.

Sir Peter Wanless, head of the British children's charity the National Society for the Prevention of Cruelty to Children (NSPCC), said the conclusion should send a strong signal to those who run online platforms.

– This should send shockwaves through Silicon Valley. Tech companies must expect to be held to account when they put child safety second to commercial decisions, he said.

Representatives from Pinterest and Meta attended the hearings, and Pinterest's Judson Hoffman admitted that the platform was not safe when Molly used it.

– There was content that should have been removed, but was not, Hoffman admitted.

NOT SAFE: Judson Hoffman represented Pinterest during the hearings and admitted that the platform was not safe when Molly used it in 2017. Photo: James Manning / PA Photos / NTB


Following the decision, a spokesperson for Meta – the company that owns Instagram and Facebook – said it was “committed to ensuring that Instagram is a positive experience for everyone, especially teenagers.” They will now conduct a thorough review of the coroner’s report as soon as they receive it.

– Time is up

Mari Velsand, head of the Norwegian Media Authority, believes that the time has come for the tech giants to show more transparency regarding, among other things, the use of algorithms.

– They have great power over public discourse, but share little information, for example about the use of algorithms, which determine what content we see and what is amplified in our social media feeds. The time is undoubtedly ripe for regulation, and fortunately several important things are now happening at EU level, she tells Dagbladet.

Since this is a British ruling determining the cause of death in a specific case, Velsand does not believe it will have any direct significance in Norway. She still believes it is important, because it can raise awareness of the challenges posed by social media.

Velsand points out that challenges related to harmful content on social media, including self-harm and suicide, are already a priority area in both Norway and the EU.

– Among other things, through the EU's "Better Internet for Kids" strategy and the Norwegian government's new strategy for a safe digital upbringing, "Rett på nett". The Norwegian Media Authority is leading work in which seven state inspectorates and directorates will develop an action plan to follow up this strategy, says Velsand, adding:

– Contributing to better protection of minors against harmful content is also one of the objectives of a new EU regulation that is about to be passed: the Digital Services Act (DSA).

HIGH TIME: The head of the Norwegian Media Authority, Mari Velsand, believes that the time has come for the tech giants to show more transparency regarding, among other things, the use of algorithms. Photo: The Norwegian Media Authority / Mathias Fossum

Will ensure transparency

Among other things, the regulation will give authorities in EU countries the power to impose sanctions if platforms host illegal material. Moderating harmful content is more complicated, but the DSA has something to offer here too.

– Much of the harmful content is legal, and is therefore not something the state can or should be able to order removed, as that would quickly conflict with freedom of expression. The DSA will require the global platforms to provide detailed information about their algorithms, and allows sanctions if the algorithms amplify the spread of harmful content and disinformation.

The DSA will be incorporated into Norwegian law, which means that Norwegian authorities will also be able to order platforms to remove content that is illegal under Norwegian law.

– Norwegian reporting bodies for illegal content will be established, and the platforms are obliged to give their notifications priority treatment. If social media services violate the DSA, they risk fines of up to six percent of the company's global turnover, explains Velsand.


Calling for urgency

After Molly Russell's death, the family established the Molly Rose Foundation (MRF). The foundation aims to prevent suicide among young people under the age of 25.

When the coroner's conclusion was delivered, the foundation released a statement from Molly's family calling on the British authorities to speed up the passage of a new British online safety law.

– The time has come to protect our innocent young people instead of allowing platforms to profit from their suffering. For the first time, technology platforms have been held directly responsible for the death of a child, the statement reads.

The law they refer to is the Online Safety Bill. It was first introduced in 2021, but has not yet come into force.

SPEAKING TO THE PRESS: Molly's father, Ian Russell, delivered a statement to the media outside Barnet Coroner's Court in north London on Friday. Photo: Joshua Bratt / PA Photos / NTB

The law will force websites to crack down on illegal, as well as legal but harmful, material. If companies fail to fulfil this duty, they can be fined up to ten percent of their annual global turnover.

– Incredibly brave

Molly's father, Ian Russell, also gave a statement to the press outside the court on Friday.

– Over the past week, we have heard a lot about one tragic story: Molly's story. Sadly, too many people are being affected in the same way right now. So I just want to say that, however dark it seems, there is always hope. If you are struggling, please speak to someone you trust or one of the many support organisations, rather than engaging with online content that may be harmful, he said.

The coroner's conclusion has attracted attention around the world. Prince William also commented on the ruling.

– No parent should ever experience what Ian Russell and his family have been through. They were so incredibly brave. Online safety for our children and teens must become a prerequisite, not an afterthought, Prince William wrote on Twitter.
