Meta Faced Internal Policy Limiting Action on Suspected Sex Trafficking Posts, Court Documents Reveal
WASHINGTON - Internal Meta documents revealed in a recent court case suggest the company previously operated with a policy of taking action on suspected sex trafficking posts only after 17 separate reports, a practice critics say significantly delayed intervention and potentially endangered victims. The details emerged during legal proceedings concerning allegations that Meta failed to adequately protect children on its platforms.
According to testimony from former Meta engineer Tracy Jayakumar, building a system for users to report child sexual abuse material (CSAM) within Instagram in 2020 would not have been a substantial undertaking. “You essentially add an additional option to the existing number of reporting … options that are out there,” she stated in a court document. At the time, Instagram already facilitated direct reporting of less severe violations like spam, intellectual property infringement, and promotion of firearms.
Meta publicly acknowledged identifying illegal child exploitative content on Facebook and Instagram in October and November 2020, detailing its findings in a February 2021 blog post. Following this discovery, the company said it began “developing targeted solutions, including new tools and policies to reduce the sharing” of such content, including pop-up resources for users encountering the material.
Instagram now provides instructions for reporting child sexual abuse and states on its “Facts for law enforcement” page that all apparent instances of child sexual exploitation are reported to the National Center for Missing and Exploited Children (NCMEC) and, where applicable, to law enforcement agencies worldwide. NCMEC then refers cases to appropriate authorities.