More than 16.2 million pieces of content proactively “actioned” on Facebook in India during November: Meta

“Of these incoming reports, we provided tools for users to resolve their issues in 461 cases,” the report said.

Social media giant Meta said more than 16.2 million pieces of content were proactively “actioned” on Facebook across 13 violation categories in India during the month of November.

Its photo-sharing platform Instagram proactively actioned more than 3.2 million pieces of content across 12 categories during the same period, according to data shared in a compliance report.

Under IT rules that came into effect earlier this year, large digital platforms (with more than 5 million users) are required to publish monthly compliance reports, mentioning the details of complaints received and the action taken thereon.

These reports also include details of content removed or disabled through proactive monitoring using automated tools. Facebook had proactively “actioned” 18.8 million pieces of content across 13 categories in October, while Instagram had proactively actioned more than 3 million pieces across 12 categories during the same period.
In its latest report, Meta said Facebook received 519 complaints from users through its Indian grievance mechanism between November 1 and November 30.
“Of these incoming reports, we provided tools for users to resolve their issues in 461 cases,” the report said.

These include pre-established channels to report content for specific violations, self-remediation flows where users can download their data, avenues to address account-hacking issues, etc., it added. Between November 1 and November 30, Instagram received 424 reports through the Indian grievance mechanism. Facebook’s parent company recently changed its name to Meta.

Applications under Meta include Facebook, WhatsApp, Instagram, Messenger, and Oculus. According to the latest report, the more than 16.2 million pieces of content actioned by Facebook during November included content related to spam (11 million), violent and graphic content (2 million), adult nudity and sexual activity (1.5 million), and hate speech (100,100).

Other categories under which content was actioned include bullying and harassment (102,700), suicide and self-harm (370,500), dangerous organizations and individuals: terrorist propaganda (71,700), and dangerous organizations and individuals: organized hate (12,400).

Categories such as Child Endangerment – Nudity and Physical Abuse saw 163,200 pieces of content actioned, Child Endangerment – Sexual Exploitation saw 700,300 pieces actioned, and 190,500 pieces were actioned under violence and incitement. “Actioned” content refers to the number of pieces of content (such as posts, photos, videos, or comments) on which action was taken for violating standards.

Taking action could include removing a piece of content from Facebook or Instagram, or covering photos or videos that may be disturbing to some audiences with a warning.

The proactive rate, which indicates the percentage of all content or accounts acted upon that Facebook found and flagged using technology before users reported them, ranged between 60.5 percent and 99.9 percent in most of these cases.

The proactive removal rate for content related to bullying and harassment was 40.7 percent, as this content is contextual and highly personal in nature. In many cases, people need to report this behaviour to Facebook before it can identify or remove such content.

For Instagram, more than 3.2 million pieces of content were actioned across 12 categories during November 2021. These included content related to suicide and self-harm (815,800), violent and graphic content (333,400), adult nudity and sexual activity (466,200), and bullying and harassment (285,900).

Other categories under which content was actioned include hate speech (24,900), dangerous organizations and individuals: terrorist propaganda (8,400), dangerous organizations and individuals: organized hate (1,400), Child Endangerment – Nudity and Physical Abuse (41,100), and violence and incitement (27,500).
In the Child Endangerment – Sexual Exploitation category, 1.2 million pieces of content were proactively actioned in November.
