Molly Russell from Harrow, England, was only 14 when she took her own life in 2017. Ever since the tragedy, her family has maintained that social media was a trigger.
Last week their claim was upheld at the conclusion of the inquest into the 14-year-old's death, the BBC and others report.
A historic ruling came Friday, when Andrew Walker, a coroner in north London, became the first to conclude that a child's death may be directly linked to harmful content on social media.
– Her death was the result of self-harm while she suffered from depression, as well as the negative effects of online content, Walker concluded.
At the same time, he stressed that the material the 14-year-old was exposed to, on the Instagram and Pinterest platforms, was not safe. Walker believes the content should never have been accessible to a child.
– It should send shockwaves
According to the coroner, the platforms' algorithms meant that the 14-year-old was regularly exposed to texts and images related to depression, self-harm and suicide in the months leading up to her death.
Sir Peter Wanless, head of the British children's rights organization the National Society for the Prevention of Cruelty to Children (NSPCC), said the ruling should send a strong signal to those in charge of online communities.
– This should send shockwaves through Silicon Valley. Tech companies must expect to be held accountable when they let child safety take a back seat to business decisions, he said.
Representatives from Pinterest and Meta attended the hearings, and Pinterest's Judson Hoffman admitted that the platform was not safe when Molly used it.
– There was content that should have been removed, but hadn’t been removed, Hoffman admitted.
Following the decision, a spokesperson for Meta – the company that owns Instagram and Facebook – said it was “committed to ensuring that Instagram is a positive experience for everyone, especially teenagers.” They will now conduct a thorough review of the coroner’s report as soon as they receive it.
– Time is up
Mari Velsand, head of the Norwegian Media Authority, believes that the time has come for the tech giants to show more transparency regarding, among other things, the use of algorithms.
– They have great power over public discourse, but share little information, for example about the use of algorithms, which govern the content we see and are served in our social media feeds. The time is undoubtedly ripe for regulation, and fortunately several important things are now happening at the EU level, she tells Dagbladet.
Since this is a British ruling determining the cause of death in a specific case, Velsand does not believe it will have any direct legal significance here at home. She still considers it important, because it can raise awareness of the challenges posed by social media.
Velsand points out that challenges related to harmful content, including self-harm and suicide, on social media are already a priority area in both Norway and the EU.
– Among other things, through the EU’s “Better Internet for Kids” strategy and the Norwegian government’s new strategy for safe digital education, “Rett på nett”. The Norwegian Media Authority is leading work in which seven state inspectorates and directorates will develop an action plan to follow up on this strategy, says Velsand and adds:
– Contributing to better protection of minors against harmful content is also one of the objectives of a new EU regulation that is about to be passed: the Digital Services Act (DSA).
Will ensure transparency
Among other things, the regulation will give authorities in EU countries the power to impose sanctions if platforms host illegal material. Moderating harmful but legal content is more complicated, but the DSA will have something to offer here too.
– Much of the harmful content is legal, and is therefore not something the state can or should be able to order removed, as that would quickly conflict with freedom of expression. But the DSA will ensure insight into the large platforms' algorithms, and the ability to sanction if algorithms amplify the spread of harmful content and disinformation.
The DSA will be incorporated into Norwegian law, so Norwegian authorities will also be able to order platforms to remove content that is illegal under Norwegian law.
– Norwegian notification channels will be established for illegal content, which the platforms are obliged to process with priority. If social media services violate the DSA, they risk penalties in the form of fines of up to six percent of the company's global revenue, Velsand explains.