TikTok does not transfer European user data to China, company executives say

Last April, TikTok announced the opening of a Transparency and Accountability Center in Europe, giving experts, academics and political leaders access to information about how the platform reviews content, the human teams that ensure safety, and the technology that underpins those guarantees. The center is expected to open its physical doors in 2022 and will be located in Ireland.

Business Insider España was among the media outlets invited to visit the center virtually and learn first-hand how TikTok safeguards privacy and user safety, as well as how it is handling some of the controversies that have dogged the company, such as complaints from users and consumer groups.

Recently, for example, Luxembourg's National Data Protection Commission (CNPD) imposed a fine of up to 350 million euros on Amazon for an alleged series of violations of the General Data Protection Regulation (GDPR). This is precisely the kind of outcome TikTok wants to avoid, and the new center, which complements the platform's regular transparency reports, was created with that in mind.

In this context, Business Insider España had the opportunity, exclusive among Spanish media, to put these questions and other current issues to Cormac Keenan, Head of Trust & Safety; Roland Cloutier, Global Chief Security Officer (CSO); Jade Nester, Director of Public Policy for Data in Europe; and Michael Satyapor, Director of Product.

No European user data is stored in China

In March of this year, the journalist Chris Stokel-Walker investigated whether data from users outside China was being sent to that country, after Helen Dixon, Ireland's Data Protection Commissioner, said at a virtual conference that engineers in China "could" be accessing the data of European TikTok users.

However, Cloutier denied this, saying that user data is stored only in the United States and Singapore. "In fact, TikTok is not even available in China," he added. The company's security team is based in the United States and operates a "least-privileged access model," meaning employees can only access the data strictly necessary to carry out their work.
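The least-privileged model Cloutier describes can be illustrated with a toy sketch: each role is granted only an explicit set of permissions, anything not granted is denied, and access is revoked once the task ends. This is our own minimal example to show the idea, not TikTok's actual access-control system; all role and permission names here are hypothetical.

```python
# Toy sketch of a least-privileged access model (illustrative only;
# the roles and permissions below are invented for this example).
ROLE_PERMISSIONS = {
    "security_analyst": {"read_security_logs"},
    "support_agent": {"read_account_metadata"},
}

def can_access(role: str, permission: str) -> bool:
    """Deny by default: a role may only use permissions explicitly granted to it."""
    return permission in ROLE_PERMISSIONS.get(role, set())

def revoke_role(role: str) -> None:
    """Once the work is done, the role's access is removed entirely."""
    ROLE_PERMISSIONS.pop(role, None)
```

Under this model, `can_access("support_agent", "read_security_logs")` is false even though the permission exists for another role, and after `revoke_role("security_analyst")` that team's access disappears altogether, mirroring the "access will be removed" policy described below.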

TikTok's data security centers are located in the United States, Singapore and, in 2022, in Ireland.

"Building on that, we also have a usage management policy which says that when [a security employee] finishes any kind of work, no matter where they operate, that access will be removed," he added. "And of course we have hundreds of security professionals around the world who make sure that happens, so a very rigorous protection program has been put in place."

On this point, Nester emphasized that, since teams are spread around the world, accessing that data may involve data transfers. "In this context, if we are talking about transfers from the European Union, we comply with the standard contractual clauses," she explained. "We have made a significant investment of time and effort in reviewing and evaluating all of our data transfer practices."

The protection of European users does not end there: TikTok intends to go further, unlike companies such as Facebook that are fighting regulatory battles over the GDPR. The company works side by side with multinational security consulting firms that help it develop new technologies, as well as obtain certifications and accreditations that show users their data is fully protected.

TikTok responds to complaints about underage users

Another vitally important issue for social networks, beyond protecting European user data under the GDPR, is the strict supervision of the content shown to minors. At the beginning of this year, Business Insider created a fictitious 14-year-old user whose feed began showing ads for rhinoplasty and cosmetic surgery in just 8 minutes.

In April of this year, BEUC, a European consumer organization, filed a complaint against TikTok for failing to adequately protect underage users from advertising; the company's response is due this June.


In response to this and other issues involving minors, the video platform has taken strict measures, according to Nester: all accounts belonging to users under 16 are now private by default and have no access to direct messaging. "We have strict policies prohibiting any advertising that directly appeals to users who are under the age of digital consent."

Nester also said the company welcomes conversations such as the one initiated by BEUC, as TikTok engages with regulators and other stakeholders on consumer protection and transparency, specifically with Ireland's Competition and Consumer Protection Commission and the Swedish Consumer Agency.

As a result of this commitment, Keenan confirmed, the number of underage accounts deleted has increased, a figure that will be detailed in TikTok's next Transparency Report. As part of this pledge to transparency, the visit to the center also yielded figures describing the sheer scale of moderation on the platform.

Less than 1% of videos were deleted

TikTok's Community Guidelines are quite clear, prohibiting any content that attacks individuals, such as racist or sexist posts. However, with more than 100 million European users on the platform, tens of millions of posts are published, of which only around 1% violated those rules, according to the Transparency Report for the second half of 2020.

Figures on deleted videos on TikTok.

In that half-year period, 89,132,938 videos were deleted. Of these, 92.4% were identified before any user reported them, 83.3% were removed before receiving a single view, and 93.5% were moderated within 24 hours of publication, all thanks to the combination of technology and human review teams.

In this way, the For You section is configured individually for each user, so that content that violates the Guidelines is not shared and is hardly ever seen.

“Each person is presented with a series of specially selected videos, which makes it easy to discover new and exciting content and creators,” explained Satyapor. “This process is driven by our recommendation system, which uses a series of indicators to suggest a combination of relevant, diverse and safe content.”
