Privacy is one of the banners of Facebook Inc. and all of its social networks. At least, that is what its top executives say, and it is also what the company's policies state.
“No one outside of this chat, not even WhatsApp, can read or listen to the messages,” reads the notice shown in every conversation on the platform. This is due to the end-to-end encryption that became a flagship feature of the app and a demonstration of the move toward absolute privacy that Mark Zuckerberg and Facebook have pursued in recent years.
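The principle behind that notice can be shown with a toy sketch. This is NOT real cryptography (WhatsApp uses the Signal protocol, not a XOR cipher); it only illustrates the idea that when a key is held solely by the two endpoints, a relay server in the middle sees only unreadable ciphertext:

```python
# Toy illustration of the end-to-end encryption idea (NOT real cryptography):
# only the two chat endpoints hold the key, so the server relaying the
# message sees ciphertext it cannot read.

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """XOR each byte with the repeating key; applying it twice restores the input."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

shared_key = b"secret-shared-key"   # known only to sender and recipient

plaintext = b"meet at noon"
ciphertext = xor_cipher(plaintext, shared_key)   # what the server actually relays

# The relay sees only scrambled bytes...
assert ciphertext != plaintext
# ...but the recipient, holding the same key, recovers the message.
assert xor_cipher(ciphertext, shared_key) == plaintext
```

In the real app, the key exchange and ciphers come from the Signal protocol, but the trust model is the same: the server forwards ciphertext and never holds the key.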
Moreover, in 2018 testimony before the United States Senate, Zuckerberg, as Facebook's chief executive, assured lawmakers that they “do not see the content that is sent through WhatsApp.”
But how true is this? Are messages 100 percent protected? How effective is end-to-end encryption? A new report from ProPublica revealed that WhatsApp has teams that not only read and review messages, but in some cases also act on them, or even pass them to law enforcement agencies for prosecution.
MORE THAN 1,000 EMPLOYEES
“WhatsApp has more than 1,000 contractors who are in offices in Texas, Dublin and Singapore. Sitting at their jobs, these employees use Facebook software to sift through millions of private messages, images and videos,” reads ProPublica’s report, titled ‘How Facebook Undermines Privacy Protections for Its 2 Billion WhatsApp Users’.
These thousand-plus employees review the messages that appear on their screens and pass judgment on the content. To be clear, they do not review every message that is sent; that would be impossible anyway.
These workers end up ruling on the millions of messages, photos and videos that users have flagged and reported as “potentially abusive” (child pornography, spam and hate speech).
“Workers have access to only a subset of WhatsApp messages, which users automatically flag and forward to the company as possibly abusive. The review is one element in a broader monitoring operation in which the company also reviews material that is not encrypted, including data about the sender and his account,” the report reads.
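The mechanism the report describes can coexist with end-to-end encryption because the recipient's device already holds the decrypted message; tapping "report" forwards that plaintext, together with unencrypted account metadata, to a review queue. The sketch below is a hypothetical illustration of that flow — the class and field names are invented for this example and are not WhatsApp's actual internals:

```python
# Hypothetical sketch of flag-and-forward moderation under end-to-end
# encryption: encryption is never broken by the company, because the
# reporting user's own device supplies the already-decrypted content.
# All names here are illustrative, not WhatsApp's real API.

from dataclasses import dataclass, field

@dataclass
class Message:
    sender_id: str          # account metadata: not end-to-end encrypted
    decrypted_text: str     # plaintext as it exists on the recipient's device

@dataclass
class ModerationQueue:
    reports: list = field(default_factory=list)

    def submit(self, message: Message, reason: str) -> None:
        # Only user-flagged messages ever reach human reviewers; everything
        # else stays ciphertext on the company's servers.
        self.reports.append({
            "sender": message.sender_id,
            "content": message.decrypted_text,
            "reason": reason,
        })

queue = ModerationQueue()
msg = Message(sender_id="user-123", decrypted_text="example spam message")
queue.submit(msg, reason="spam")   # the "report" button on the recipient's phone
```

The design point is that the reviewers in Texas, Dublin and Singapore see only what a participant in the chat chose to forward, plus the sender metadata that was never encrypted in the first place.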
This is one of the many strategies the companies behind Facebook Inc.’s social networks use to offer safer environments and prevent abuse. So what, then, is the problem?
The first thing, as Carl Woog, WhatsApp’s director of communications, told ProPublica, is that the company does not consider this work to be “content moderation,” a practice that Facebook and Instagram, for example, openly acknowledge. The big difference is that on those two networks the messages are not encrypted.
In addition, those networks publish reports on content that has been removed and openly cataloged as offensive. For WhatsApp, no such reports exist.
Finally, ProPublica also indicated that WhatsApp shares certain private data with law enforcement bodies, such as the US Department of Justice.