The European Commission (EC) has formally accused Meta Platforms of failing to meet its obligations under the Digital Services Act (DSA). According to the Commission's preliminary findings, Meta's platforms, specifically Facebook and Instagram, do not provide a user-friendly or easily accessible "notice and action" mechanism by which users can report illegal content such as child sexual abuse material or terrorist content. In particular, the EC points to the use of so-called "dark patterns" in the user interface, which may confuse and discourage users from submitting complaints.
The proceedings, which stem from a formal investigation opened in 2024, also address Meta's internal appeals mechanisms and its transparency obligations. The EC reports that Meta lacks an effective appeal process allowing users to present reasons or evidence when challenging moderation or removal decisions, and that the company has not granted sufficient access to platform data for independent researchers, a requirement of the DSA for "very large online platforms".
The investigation began in 2024, with the EC publishing its preliminary findings in late 2025. Under the DSA, the EU treats large platforms as critical online infrastructure for the digital economy and democratic process, requiring them to mitigate systemic risks, protect users (especially minors), and uphold transparency; the Commission enforces these obligations from its headquarters in Brussels.
If the EC finalises its decision and finds Meta in breach, the company risks fines of up to 6% of its global annual turnover, which could amount to billions of dollars given Meta's scale. Meta, in response, has stated that it "disagrees with any suggestion we have breached the DSA" and notes that it has made changes to its content-reporting tools, appeals systems and researcher data access in the EU.
For the fintech and broader regulatory risk community, this case underscores how Big Tech platforms are increasingly treated akin to system-critical infrastructure, much like financial institutions, subject to rigorous regulatory guard-rails. For investors and partners in the digital economy, a key takeaway is that business models built on massive user networks must now embed multi-jurisdictional regulatory compliance into strategic planning, with particular attention to regions such as the EU where regulators are moving proactively.
In the broader geopolitical context, the EC’s move signals the EU’s determination to enforce its digital rulebook and reduce regulatory arbitrage by global platforms. It reinforces that the DSA is not simply a voluntary code but establishes enforceable duties of accountability, transparency and user protection. The timing is also notable as digital platforms face heightened scrutiny over disinformation, child safety and algorithmic moderation ahead of elections and emerging technologies such as generative AI.
As the investigation reaches its next phase, market watchers and risk teams should monitor Meta's remediation proposals, the possibility of binding commitments from Brussels and any final decision. The outcome will set a benchmark for how platforms must organise moderation systems, appeals, data access and transparency, and will thus shape how investors, regulators and corporate strategists assess exposure across the broader ecosystem of online services.