TikTok and Meta under EU scrutiny: violations of the Digital Services Act due to lack of transparency and ineffective content reporting mechanisms

The European Commission accuses TikTok and Meta of violating the Digital Services Act: denied access to researchers, complex reporting, and ineffective appeals for users.

The European Commission accuses TikTok, Facebook, and Instagram of obstructing researchers' access to data and failing to provide users with adequate tools to report illegal content or challenge moderation decisions.

Brussels – The European Commission today announced preliminary findings that TikTok and Meta (the parent company of Facebook and Instagram) have breached several obligations under the Digital Services Act (DSA), one of the European Union's most important regulations on transparency and the protection of online users.

According to the findings, the platforms allegedly failed to guarantee researchers adequate access to public data, hindering analysis of the risks associated with the spread of illegal or harmful content, including content potentially dangerous for minors. The Commission also faults Meta for the lack of clear, accessible mechanisms for reporting illegal content and for appealing moderation decisions.

Data Access: Obstacles for Researchers and Lack of Transparency

Preliminary investigations found that Facebook, Instagram and TikTok allegedly put in place burdensome procedures and costly tools for researchers seeking access to public data, often providing only partial or unreliable information.
According to Brussels, this practice undermines the ability to study and monitor the platforms' behaviour, especially users' exposure – particularly that of minors – to illegal or harmful content.

The Digital Services Act treats researchers' access to data as a key element of transparency and public accountability, essential for assessing the impact of large digital platforms on mental health, public debate and online safety.

Meta in the crosshairs: cumbersome reporting and "dark patterns"

Regarding Meta, the European Commission charges that neither Facebook nor Instagram offers users a simple, intuitive "notice and action" mechanism to report illegal content, such as child sexual abuse material or terrorist propaganda.

On the contrary, the tools currently in use allegedly impose complex and unnecessary steps that discourage users from filing reports. According to the investigation, Meta also relies on so-called "dark patterns": interfaces designed to confuse or deter users, effectively rendering the reporting systems ineffective.

The Commission underlines that, under the DSA, platforms that do not promptly remove illegal content may lose the liability exemption provided by law, exposing themselves to sanctions and legal proceedings.

Ineffective appeals and limitations on users' freedom

Another critical point concerns Meta's appeal mechanisms. According to the Commission, EU users lack effective tools to challenge content removals or account suspensions.
Facebook and Instagram's internal systems allegedly do not allow users to submit evidence or explanations in support of their appeals, limiting their ability to defend their freedom of expression.

The preliminary findings also draw on cooperation with Ireland's digital services coordinator, Coimisiún na Meán, which assisted in the investigations concerning Meta.

The next steps of the investigation

The companies involved will now be able to examine the investigation file and respond to the findings in writing. TikTok and Meta may also adopt corrective measures to remedy the alleged violations.

In the meantime, the Commission will consult the European Board for Digital Services before reaching a final decision.
If the violations are confirmed, Brussels may issue a non-compliance decision and impose fines of up to 6% of the companies' annual worldwide turnover, plus periodic penalty payments for continued non-compliance.

In addition, from October 29, 2025, a new delegated act on data access will enter into force, which will also guarantee researchers access to non-public data from large platforms and search engines, further strengthening democratic oversight of the digital ecosystem.

The Commission's comment

The Executive Vice-President for Tech Sovereignty, Security and Democracy, Henna Virkkunen, commented:

Our democracies depend on trust. That means platforms must empower users, respect their rights, and open their systems to scrutiny. The DSA makes this a duty, not a choice. With these actions, we are ensuring that platforms are accountable for their services, as laid down in EU law, to their users and to society.

The European Commission's action against TikTok and Meta marks a decisive step in the full implementation of the Digital Services Act, the regulatory pillar with which the EU aims to govern large online platforms, ensuring transparency, security and the protection of fundamental rights.

In a context where digital information and algorithms increasingly shape public life, the investigation opened by Brussels reaffirms a clear principle: Big Tech must be transparent, accountable, and subject to democratic oversight.

Reproduction reserved © Copyright La Milano

×