
Facebook: Ex-employee Sophie Zhang speaks out

The legal situation is clear. It is forbidden to incite violence online, and anti-Semitic motives are treated as an aggravating factor in sentencing. The operators of large social networks are now obliged to delete criminal content and report it to the investigative authorities.

At the beginning of April, Germany tightened its laws against so-called hate crime, on the grounds that communication on the Internet was becoming increasingly brutal.

But how effective is the law – and what is Facebook doing to comply with the new rules?

If you look around the platform, you will still find many posts whose status is not immediately clear, at least to laypeople: are these opinions that violate good taste but are still permitted, or are they calls to violence? In places, the line between the two is hard to draw.

This is illustrated by an exchange in a Facebook group critical of vaccination that calls itself “We are the healthy 99.97%”. First, a user named Petra A. writes that vaccination means going “at full risk and regardless of losses” and links to a video by the former SPD Bundestag member Wolfgang Wodarg, in which he warns that vaccinated people will become “guinea pigs”. These posts are covered by freedom of expression; that much is clear.


But then Petra A. writes: “The government should be held accountable for all crimes against its own people”. Whether she means that violence should be used against members of the government remains unclear. Another user, Roland G., escalates significantly, writing: “Put everyone against the wall”. That is clearly a call to violence.

Facebook users can report such posts to the platform operator. But anyone who wants to use this option practically has to be a lawyer: “Select a paragraph from the German Criminal Code”, the form says, and “Why do you think the content violates the selected paragraphs?”

Overwhelmed moderators

Facebook’s moderators also appear to be overwhelmed. It is their job to decide, within 24 hours, whether each reported post is covered by freedom of expression or not. In fact, many posts that plainly call for violence remain online for a long time, often without comment.

This may be due to the sheer volume of posts. Right now, in the summer before the federal election, Corona deniers, AfD supporters and adherents of the “Reichsbürger” ideology are writing a great deal on the platform that could be legally actionable.

Facebook’s moderators have also become more cautious about deleting, because the company has recently suffered several defeats in court: its moderators had blocked posts that judges later ruled were covered by freedom of expression.


The lawyer Joachim Steinhöfel, for example, had brought such suits. Users who report content promoting the “Reichsbürger” ideology now initially get an email back: “We cannot see that the content you reported is illegal.”

In the view of the Green Bundestag member Renate Künast, Facebook is not doing enough to counter threats of violence on its platforms. She suspects that the company, for economic reasons, acts too laxly against legally actionable content. “While Facebook says the platform wants to connect people, it is actually a huge business model. The more money is made with it, the more the content is influenced.”

The Bundestag member Renate Künast has been taking legal action against Facebook for a long time

Source: picture alliance / Geisler-Fotopress

Künast has already fought legal disputes with Facebook herself. They turned on precisely this contentious question: when do hate messages distributed via the platform stop being mere expressions of opinion and become legally actionable? The politician had been denigrated in Facebook comments as a “dirty pig” and a “slut”.

After the Berlin regional court ruled that the statements were covered by freedom of expression, the next higher instance, the Berlin Higher Regional Court, decided that the insults were punishable.

Künast is of the opinion that there are far too many hate messages on Facebook, and this in turn restricts freedom of expression on the platform. “According to a study, 58 percent of women do not dare to express their political opinion online for fear of hatred and calls for violence.”

That, in turn, is dangerous in view of the upcoming elections. If people no longer dare to express their opinions freely on Facebook, says Künast, the picture of public opinion will be distorted.

How Facebook thwarted a critic

There is also internal criticism of Facebook’s handling of hateful comments – and targeted disinformation campaigns.

Data scientist Sophie Zhang worked for Facebook for two years, from 2018 to 2020. The company hired her to find fake user accounts. During this assignment, she came across extensive disinformation campaigns on the platform, in countries such as Honduras, Azerbaijan and Bolivia.

But when she pointed this out internally, nothing happened for months. Facebook, she says in retrospect, was apparently not interested in remedying the problems. So she quit, turning down $64,000 offered in exchange for her silence, and went public with the social network’s disinformation problems.

Data analyst Sophie Zhang worked for Facebook for more than two years

Source: AP

WORLD ON SUNDAY: Ms. Zhang, there is a slogan at Facebook: “no problem on Facebook is someone else’s problem”. Employees are thus expected to get involved when they discover problems. But what happens if you take that seriously and actually point problems out?

Sophie Zhang: Well, this saying looks great on stickers and posters, but acting on it actually makes you unpopular. Another of those slogans is: “What would you do if you weren’t afraid?” I think Facebook didn’t like my answer to that. A fundamental problem is that Facebook has grown so big that the left hand no longer knows what the right hand is doing. So problems go unresolved. When I wrote to all employees about these problems on my way out, Facebook did everything it could to take my post offline.

WORLD ON SUNDAY: How seriously does Facebook take user complaints?

Zhang: User complaints weren’t my job, but in general I would say that Facebook does not respond to reports and complaints as much as people outside the company might think. A lot falls through the cracks. Facebook’s networks are huge, and there are so many user complaints that the company does not have the people to look at them all.


WORLD ON SUNDAY: Who at Facebook ultimately decides whether posts are deleted and which accounts are blocked?

Zhang: Ultimately, it is decided by the public policy managers, who at Facebook are also responsible for maintaining contact with politicians. There is no firewall between those who enforce the rules and the policy team. Many of these decisions therefore depend on whether Facebook wants to maintain good relationships with the respective government and its politicians. That in turn depends on how important Facebook considers the respective country: Germany and India are important to Facebook, so a lot gets done. Smaller countries, on the other hand, are perceived as unimportant, so not much happens.

WORLD ON SUNDAY: How does Facebook decide which resources to invest in solving problems like fake news or hate speech?

Zhang: Not very well. Seriously, we always have to remember that Facebook is a company that is not out to save the world but to make money. So Facebook only takes problems seriously if they threaten revenue. In the end, Facebook always prioritizes according to PR considerations.


WORLD ON SUNDAY: How independent and effective is Facebook’s new oversight board, which is also supposed to rule in disinformation disputes?

Zhang: Its appointment is a step in the right direction. But how independently it can decide depends on whether its members can emancipate themselves from the company and on how they build influence and credibility. One problem is that while outsiders can contact the board to have decisions about deleting a post reviewed, Facebook employees with insider knowledge cannot. In addition, the board lacks direct access to data. It has only as much material, and as much authority, as Facebook grants it. That is not real independence. And the board only looks at individual cases, not entire disinformation campaigns. It’s a bit like trying to drain the ocean with a teaspoon. There are just too many problems at Facebook at the same time.
