
EU Guidelines Protect Minors Online Under Digital Services Act

Digital Safety Under Scrutiny: EU Commission Investigates Protections for Minors Online

Brussels, Belgium – October 17, 2025 – The European Commission has launched formal investigations into the child safety measures employed by major digital platforms, including YouTube, Apple's App Store, and Google Play. The inquiries center on ensuring adequate safeguards are in place to prevent underage access to harmful content and illegal goods, and to mitigate risks associated with age-inappropriate applications.

The Commission's actions respond to growing concerns about the vulnerability of young people in the digital landscape. With increasing digitalization across all age groups, authorities are prioritizing verification that essential protections for minors are consistently upheld. The investigations will assess how each platform enforces age restrictions – notably the prohibition of services for users under 13 – and how effectively they prevent the distribution of illicit items, such as those related to gambling or the creation of non-consensual intimate imagery.

Specifically, the Commission is examining YouTube's age verification systems and the algorithms governing content recommendations to users, prompted by reports of minors being exposed to damaging material. For Apple's and Google's app stores, the focus is on the processes used to categorize applications by age rating and the potential for minors to encounter harmful apps, including those facilitating gambling or generating sexualized content (such as "nudify" apps).

These investigations follow a broader trend of increased regulatory attention on digital platforms' responsibilities regarding user safety. The Digital Services Act (DSA), which came into full effect in February 2024, already requires platforms to address illegal content and protect fundamental rights online. The current Commission inquiries represent a focused effort to ensure these obligations are being met with respect to the unique vulnerabilities of young users. Outcomes of the investigations could include demands for improved age verification, algorithmic transparency, and stricter content moderation policies.
