X Faces EU Probe Over Grok AI Sexualized Deepfakes

EU Launches Investigation into X Over Alleged Failure to Combat Deepfake Pornography and Illegal Content

The European Union has initiated a formal investigation into X (formerly Twitter) over concerns that the platform has failed to adequately protect European citizens – especially women and children – from illegal and harmful content, specifically the proliferation of non-consensual intimate imagery, including sexually explicit deepfakes. The investigation, announced January 26, 2024, is being conducted under the Digital Services Act (DSA), a landmark EU regulation designed to create a safer digital space. https://ec.europa.eu/commission/presscorner/detail/en/ip_24_533

This action marks the latest escalation in scrutiny of X under the DSA, following previous warnings and requests for information regarding the platform's content moderation practices. The EU's concerns center on whether X has done enough to comply with its legal obligations to remove illegal content and protect users from harm. Failure to cooperate or address the identified issues could result in substantial fines – up to 6% of X's annual global revenue.

Understanding the Digital Services Act (DSA) and Its Implications

The DSA, which entered into force in November 2022 and became fully applicable in February 2024, represents a notable shift in how the EU regulates online platforms. https://digital-services-act.ec.europa.eu/ It establishes a tiered system of obligations based on the size and risk profile of online services. Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs), like X, face the most stringent requirements.

These requirements include:

* Content Moderation: VLOPs must have robust systems in place to identify and remove illegal content, including hate speech, terrorist propaganda, and child sexual abuse material.
* Transparency: Platforms must be transparent about their content moderation policies and how they are enforced.
* User Empowerment: Users must have clear mechanisms to report illegal content and appeal content moderation decisions.
* Risk Assessments: VLOPs are required to assess and mitigate systemic risks associated with their services, such as the spread of disinformation and the negative impact on mental health.
* Independent Audits: Regular independent audits are mandated to ensure compliance with the DSA.

The DSA's focus on proactive measures and accountability represents a departure from previous regulatory approaches, which often relied on a "notice-and-takedown" system where platforms only removed content after being notified of its illegality.

The Specific Concerns Regarding X and Deepfake Pornography

The EU's investigation specifically targets X's handling of non-consensual intimate imagery, with a particular emphasis on deepfakes. Deepfakes are synthetic media – images, videos, or audio – that have been manipulated using artificial intelligence to depict someone doing or saying something they never did. While deepfake technology has legitimate applications, it is increasingly being used to create non-consensual pornography, often targeting women and children.

"Sexual deepfakes of women and children are a violent, unacceptable form of degradation," stated Henna Virkkunen, the European Commission's executive vice-president for tech sovereignty, security, and democracy, in a press release. https://ec.europa.eu/commission/presscorner/detail/en/ip_24_533 The EU is concerned that X has not adequately addressed the rapid spread of this harmful content on its platform.

Several factors contribute to the challenges of combating deepfake pornography:

* Rapid Creation and Dissemination: AI technology makes it easier and cheaper to create deepfakes, and social media platforms facilitate their rapid dissemination.
* Difficulty in Detection: Deepfakes are becoming increasingly sophisticated, making them challenging to detect with conventional methods.
* Legal Gray Areas: The legal framework surrounding deepfakes is still evolving, and it can be challenging to determine liability and enforce existing laws.
* Platform Responsibility: The question of how much responsibility platforms bear for content created and shared by users remains a contentious issue.

X's Response and Previous DSA Scrutiny

X has previously faced criticism and requests for information from the EU regarding its content moderation practices. In September 2023, the European Commission requested detailed information from X about its measures to combat illegal content and disinformation, particularly in the context of the upcoming European Parliament elections. https://www.reuters.com/technology/eu-asks-x-details-about-illegal-content-disinformation-2023-09-05/

The platform responded with a white paper outlining its approach to content moderation, but the EU apparently found the response insufficient, leading to the current formal investigation.
