Sunday, December 7, 2025

FACTS IN: FACTS OUT – Protecting News Integrity in AI

News Industry Launches “FACTS IN : FACTS OUT” Campaign to Combat AI Distortion of News

Geneva, Switzerland – A global coalition of news organizations today launched “FACTS IN : FACTS OUT,” a campaign aimed at ensuring the integrity of news content as Artificial Intelligence (AI) platforms become increasingly central to how people access information. The initiative stems from a recent BBC/EBU report, “News Integrity in AI Assistants – An International PSM Study,” which revealed that AI tools consistently alter, misattribute, or strip context from trusted news sources, regardless of location, language, or platform.

The campaign calls on AI companies to prioritize accuracy and transparency, arguing that while AI holds immense potential, it is currently an unreliable source of news. “If AI assistants ingest facts published by trusted news providers, then facts must come out at the other end,” stated Vincent Peyrègne, CEO of WAN-IFRA.

The core concern is the growing reliance on AI platforms as gateways to news. Distorted information and obscured sourcing erode public trust, a cornerstone of a functioning democracy. “For all its power and potential, AI is not yet a reliable source of news and information – but the AI industry is not making that a priority,” added Liz Corbin, EBU Director of News.

“If enough organisations endorse FACTS IN : FACTS OUT, we hope the AI companies will address the problem urgently,” Corbin continued. “But this is not about finger-pointing; we are inviting the tech companies to engage in a meaningful dialogue with us. The public rightly demands access to quality and trustworthy journalism, no matter what technology they use, so it’s clear we need to work together.”

The campaign outlines five key principles for AI developers to uphold:

  1. No consent – no content: AI tools should only utilize news content with explicit permission from the originating publisher.
  2. Fair recognition: The value of trusted news content must be acknowledged when used by AI.
  3. Accuracy, attribution, provenance: AI-generated content must clearly display and allow verification of its original source.
  4. Plurality and diversity: AI systems should reflect the breadth of the global news landscape.
  5. Transparency and dialogue: Technology companies must openly collaborate with media organizations to establish shared standards for safety, accuracy, and transparency.

News organizations are invited to support the initiative by endorsing the five principles (contacting info@newsintegrity.org and submitting their logo), visiting www.newsintegrity.org for resources, sharing the BBC/EBU report, engaging with regulators and technology partners, and using the hashtag #FactsInFactsOut on social media.

FACTS IN : FACTS OUT is a component of the broader News Integrity in the Age of AI initiative, which aims to safeguard the future of trustworthy journalism in a rapidly evolving technological landscape. More information can be found at newsintegrity.org.