The European Union pushes for a safer internet: new rules to protect minors online

The European Parliament proposes new rules to protect minors online: a minimum digital age of 16 for accessing social media and artificial intelligence, a ban on targeted advertising, manipulative algorithms, and gambling-like game mechanics. Schaldemose (S&D): "A step forward for a safe and transparent internet, where young people are no longer commodities for platforms."

The European Parliament proposes a minimum digital age of 16 for accessing social media and artificial intelligence systems. Fines, bans, and penalties for platforms that fail to protect young people. Schaldemose (S&D): "It's time to build a child-friendly internet."

Brussels, 16 October 2025 – Minors must be able to navigate online without risking their mental health or falling victim to manipulative content and unfair commercial practices.
This is the strong message sent by the European Parliament, whose Committee on the Internal Market and Consumer Protection (IMCO) approved a new report on measures to make digital services safer for minors.

The document, adopted with 32 votes in favor, 5 against, and 9 abstentions, calls on the European Commission to strengthen the protections provided by the Digital Services Act (DSA) and to introduce new legislative instruments to combat addiction, misinformation, and the economic exploitation of young people by large online platforms.

Minimum digital age and age verification: towards a Europe of 16

At the heart of the proposal is the introduction of a uniform digital minimum age of 16 for accessing social networks, video-sharing platforms, and conversational artificial intelligence applications—such as chatbots or "AI companions"—without parental consent.
For children under 13, access to social media would be prohibited outright.

MEPs also called for the development of age verification systems that respect the privacy and fundamental freedoms of minors, stressing that these instruments must in no way relieve platforms of their primary responsibility: designing services that are safe from the outset ("safety by design").

"We need to raise the bar on child protection," said rapporteur Christel Schaldemose, a Danish Social Democrat (S&D) MEP.
"We propose a common minimum age of 16 and a ban on the most harmful practices. Online safety must be a right, not a luxury."

Stop targeted advertising, toxic algorithms, and addictive design

Parliament calls on the Commission to take forceful action against the persuasive and manipulative techniques that dominate the digital market: targeted advertising, addictive design, engagement-based recommendation algorithms, and gambling-like game mechanics.

In particular, the resolution calls for:

  • a ban on loot boxes in video games accessible to minors, considered a practice akin to gambling;
  • blocking by default the most addictive features, such as infinite scrolling, timed stories, and autoplay;
  • preventing the use of recommendation algorithms based on the personal profiles of minors;
  • a ban on the monetization of "kidfluencing", that is, the use of minors as influencers for profit;
  • penalties for apps that allow the generation of manipulated or sexually explicit images through artificial intelligence without consent;
  • a ban on manipulative chatbots and stringent rules on the use of conversational artificial intelligence.

Parliament also suggests introducing personal liability for the managers of large platforms in the event of serious and repeated violations of the rules on the protection of minors.

Rapidly implement the Digital Services Act

According to MEPs, the Digital Services Act – the new European framework regulating online platforms – must be applied more quickly and decisively, with severe fines for companies that fail to comply with their obligations to protect minors.
In the case of serious breaches, the Commission should be able to go as far as banning services that put young users at risk from operating in the European market.

The report highlights that many digital platforms, despite the new rules in force since the beginning of 2024, continue to fail to provide safe environments, exposing millions of teenagers to violent, sexual, or false content.
Parliament therefore calls on the Commission to exercise full powers of control and sanction, without hesitation.

Persuasive technologies and “dark patterns”: the new frontier of protection

The IMCO text highlights the need to also address "dark patterns"—psychological mechanisms that nudge users into unintended behaviors, such as clicking misleading buttons or staying online for as long as possible.
MEPs are calling for these practices, along with personalised advertising and influencer marketing, to be included in the future Digital Fairness Act, a new European law on digital fairness.

Among the techniques considered most dangerous are infinite scrolling, autoplay, and game dynamics that encourage compulsive buying or addiction.
All, according to experts, are strategies designed to capture young people's attention and increase the time they spend online, often with damaging consequences for their mental health.

A growing problem: addiction, misinformation, and mental health

The report comes amid growing concern about the impact of social media on children's health.
A new Eurobarometer survey published today shows that young Europeans spend on average over three hours a day on social media, with ever-increasing exposure to false, violent, or sexually explicit content.
Many report feeling "dependent" on notification mechanisms and recommendation algorithms.

Schaldemose emphasized that "digital platforms design their services to generate dependency and profit, not to protect users. It's time to reverse this logic."

Next steps: Plenary vote in November

The European Parliament will officially vote on the recommendations on the online safety of minors during the Strasbourg plenary session of 24-27 November 2025.
If approved, the text will form the basis for new European legislative initiatives aimed at making the Internet a safer, fairer, and more transparent place for children and adolescents.

With these measures, the European Union aims to redefine the responsibility of digital platforms towards minors, demanding not only compliance with the law but also a new ethics of digital design.

Reproduction reserved © Copyright La Milano
