In Australia, CNN ceased publishing content to Facebook in September 2021 after the High Court of Australia ruled that media outlets were liable for defamatory comments posted by users on their Facebook pages. The decision stemmed from a case brought by former detainee Dylan Voller, who was depicted in images circulated on Facebook and subjected to abusive comments.
The court’s ruling, as reported by Gizmodo, implied that any participation in communicating defamatory material could render a party a publisher, even if they were unaware of the comments or subsequently deleted them. CNN sought to have Facebook disable commenting functionality to mitigate the risk, but Facebook declined. This led to CNN’s withdrawal from the platform in Australia.
The situation echoed broader concerns about the responsibilities of online platforms for user-generated content, and specifically highlighted the protections afforded to those platforms under Section 230 of the Communications Act of 1934 in the United States. Enacted in 1996 as part of the Communications Decency Act, Section 230 generally provides immunity to online computer services from liability for content posted by their users. According to a summary of the law, Section 230(c)(1) states that a provider or user of an interactive computer service “shall not be treated as the publisher or speaker of any information provided by another information content provider.”
The Australian case underscored the potential consequences of removing such protections. Without Section 230-style immunity, platforms could face significant legal exposure for the actions of their users, potentially leading to increased censorship or the shutdown of online forums. The Gizmodo report framed the Australian ruling as a warning of what could occur if U.S. lawmakers weakened similar protections.
The debate over Section 230 has been ongoing for years, with critics arguing that it shields platforms from accountability for harmful content, while proponents maintain that it is essential for fostering innovation and free speech online. The Australian court decision added fuel to this debate, demonstrating the practical implications of holding platforms liable for user-generated content.
The case likewise highlighted the challenges of content moderation at scale. Even with proactive efforts to remove objectionable material, platforms struggle to identify and address all harmful content. The Australian court’s ruling suggested that platforms could be held liable even for comments they were unaware of, further complicating the task of content moderation.
In the United States, legal challenges involving online speech have continued alongside the Section 230 debate. In February 2021, a court dismissed a lawsuit brought by former Congressman Devin Nunes against CNN, which had been characterized as a Strategic Lawsuit Against Public Participation (SLAPP) suit. The dismissal, coming the same year as the Australian ruling, further illustrated the ongoing legal battles surrounding online speech and platform liability.