Facebook & Instagram: Failing to Stop Fraudulent Ads & Hate Speech

Social media platforms, including Facebook, Instagram, and TikTok, are failing to adequately address the spread of fraudulent advertising, hate speech, and disinformation, according to Professor Miriam Buiten of the University of St. Gallen. Buiten’s assessment, made public on March 3, 2026, highlights a systemic problem: platforms profit from harmful content while their reporting systems prove ineffective.

Buiten argues that the platforms’ claims of being neutral technical providers are misleading. “They are not simply offering a platform; they are actively managing their networks, selecting and moderating content,” she stated in an interview. “Because this is integral to their business model, they must also consider the societal costs of their activity.”

The critique comes as a Dutch court recently ordered Meta, the parent company of Facebook and Instagram, to offer users an algorithm-free option, allowing a chronological feed as the default. The October 2, 2025 ruling, which Buiten has commented on publicly, signals a potential shift driven by the Digital Services Act (DSA) and its capacity to reshape platform economics. In a LinkedIn post, Buiten noted that the DSA could necessitate a “fundamental rethinking” of how platforms harvest data and deploy algorithms to maximize user engagement.

Buiten’s forthcoming book on platform liability further explores the implications of the DSA, suggesting that obligations under the act require a re-evaluation of the core principles governing platform operations. The Dutch court case, she argues, is an early test of the DSA’s ability to truly alter platform dynamics.

The concerns extend beyond algorithmic amplification of harmful content. Buiten points to a broader failure of platforms to enforce their own policies regarding illegal and questionable material. This inaction, she contends, poses a risk to democratic processes and societal well-being.

The University of St. Gallen has been actively researching the accountability of social media platforms, with Professor Buiten leading investigations into disinformation and cyberbullying on platforms like Instagram, TikTok, and X (formerly Twitter). The research aims to inform regulatory efforts and legal frameworks surrounding platform responsibility.

While the European Union’s DSA aims to address some of these issues, the effectiveness of the legislation remains to be seen. The Dutch court’s decision offers one early indication of how the DSA might be enforced, but the extent to which users will actually adopt algorithm-free options is still uncertain.
