Child sexual abuse victim begs Elon Musk to remove her images from X – BBC News Thai

X Faces Scrutiny as Victim of Image Abuse Appeals to Elon Musk

San Francisco – Social media platform X, formerly Twitter, is under renewed pressure to combat the proliferation of child sexual abuse material (CSAM) after a victim appealed directly to owner Elon Musk for help. The victim, identified as “Saura,” discovered her images were being traded on the platform and pleaded with Musk to intervene: “Images of our abuse are being shared, traded and exchanged on the apps you own. If it were your own child, you would protect her without hesitation. I am begging you to do the same for us. The time to act is now.”

The case highlights ongoing concerns about how quickly perpetrators can re-establish accounts on social media platforms after being banned for distributing CSAM. Lloyd Richardson of the Canadian Centre for Child Protection explained that simply deleting accounts is insufficient. “We send a notification letter to the platform. Then they delete the account. That is really just the minimum measure,” he said, noting that offenders can often return with new accounts “within a few days.”

X maintains it has a strict policy of “absolutely not accepting” the sexual exploitation of children and says it invests in “advanced detection systems” to quickly remove violating content and accounts. A platform spokesperson stated that X works “closely” with the National Center for Missing & Exploited Children (NCMEC) and supports law enforcement in prosecuting these crimes.

Other platforms are also facing scrutiny. Telegram reported having banned more than 565,000 instances of CSAM dissemination in 2025 and stated that over a thousand moderators work on the issue. Telegram also says it proactively monitors and removes such content before it reaches users or is reported.

Despite these efforts, advocates argue that more intensive measures are needed across all social media platforms to prevent the repeated posting of CSAM and to protect vulnerable individuals. The incident with “Saura” underscores the urgent need for effective solutions to this pervasive online problem.
