
Would not change “toxic” content

“By making these changes, I expect that the time people spend on Facebook, and how much engagement there is, will go down. But I also expect that the time you spend on Facebook will be more valuable.”

Mark Zuckerberg presented it as a sacrifice on Facebook’s part when the company rolled out a major algorithm change in 2018.

“Unfortunate side effects”

The goal of the changes was, according to Facebook itself, to help people find relevant content and help them interact more with friends and family.

Zuckerberg himself pointed, among other things, to research showing that passive media consumption on Facebook, such as watching videos, is not good for people.

It is conceivable that Facebook also had people’s best interests in mind, but there were several more “selfish” reasons for the change as well.

Facebook was in fact worried that the user engagement on their platform was declining.

The Wall Street Journal is currently publishing a longer series about Facebook’s inner workings and its reluctance to make changes even when it can see that things are going wrong.

For the internet giant, the 2018 algorithm change was meant to reverse a bad trend in which user-shared content was receiving ever lower engagement.

Despite seemingly noble motives, it would soon turn out that the change did not have the desired results.

Instead of Facebook becoming a nicer place, where people increasingly commented on friends’ and family’s nice photos, Facebook’s own analysts concluded that “our approach has had some unfortunate side effects,” according to internal documents the Wall Street Journal has gained access to.

Favored angry voices

Publishers and political parties that used Facebook shifted their approach to the platform, sharing content designed to provoke anger.

In internal notes, Facebook concluded that the algorithm change, among other things by heavily weighting content that was reshared, had made the angry voices louder.

– Misinformation, toxicity and violent content are unusually prevalent among reshared content, one internal note stated.

“Political actors and publishers tell us that they are more dependent on negativity and sensationalism for distribution due to the algorithm changes that favor resharing,” wrote one researcher.

Facebook’s so-called integrity team, whose job is to improve the quality and credibility of content on the platform, wanted to do something about it. The team worked on a number of possible changes to curb the algorithm’s tendency to reward anger and lies.

Suggested changes

The leader of the team, Anna Stepanov, presented a number of suggestions to Zuckerberg, according to a note from 2020. One of the suggestions would have removed the part of Facebook’s algorithm that gave a boost to content likely to be shared onward by long chains of users.

But after presenting it all to Zuckerberg, Stepanov wrote to her colleagues that “Mark does not think we should go broad” with the changes.

– We will not launch this if it leads to major changes in the MSI impact, Stepanov wrote.

MSI stands for “meaningful social interactions”, which is what Facebook originally wanted to give an extra boost on its platform with the algorithm change.

Eventually, however, Facebook’s chief executive also came to see the need for changes.

Zuckerberg turned

On January 6 this year, rioters famously stormed the Capitol in the United States. They believed the election result was the product of cheating and deception, and Facebook came under harsh criticism because its platform had been used to organize the protests.

On 31 August this year, Facebook announced that it would now roll out changes intended to address the problem. That was a year and a half after Stepanov felt she got a thumbs-down from Zuckerberg.

“We are gradually expanding tests to place less emphasis on how likely it is that someone will comment or share political content,” wrote Facebook.

In addition to the USA, the changes are rolled out and tested in Costa Rica, Sweden, Spain and Ireland.

Despite the criticism Facebook has received and the storm it has weathered, especially after the January 6 riots, none of it seems to have affected the bottom line so far.

In July, Facebook released figures for the second quarter. The figures showed a profit of 10.4 billion dollars, equivalent to 89 billion Norwegian kroner, according to NTB.

Revenue was NOK 259 billion, generated almost entirely by advertising, which is virtually Facebook’s only source of income.

In the accounts, Facebook writes that the number of ads delivered increased by 6 percent.

The number of people in the world who use the social network at least once a month rose to 2.9 billion. Daily active users numbered 1.9 billion in the quarter, an increase of 7 percent.
