EU Disinformation Code Takes Effect Amid Censorship Claims and Trade Tensions

As of July 1, 2025, Europe’s Code of Conduct on Disinformation is officially in effect. What was once a voluntary self-regulatory framework is now locked into the Digital Services Act (DSA), requiring Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs) to meet tougher transparency and auditing obligations aimed at stamping out disinformation. Full adherence to the Code now counts as a key risk-mitigation measure and a marker of DSA compliance. And come audit time, tech platforms will have to prove they’re sticking to their commitments – or face scrutiny from Brussels.

The Code comes into effect just ahead of high-stakes trade talks between the EU and the US, with a July 9 deadline looming. But the EU has so far held firm. “The DSA and the DMA are not on the table in trade negotiations,” a Commission spokesperson said Monday, adding, “We are not going to adjust the implementation of our legislation based on actions of third countries. If we started down that road, we’d have to do it with many countries.”

This pushback isn’t limited to the EU. Canada is facing similar heat from the US after introducing its own digital services tax on American Big Tech firms, which President Donald Trump condemned as “obviously copying the European Union.” Joel Kaplan, Meta’s chief global affairs officer, was quick to praise Trump for “standing up for American tech companies in the face of unprecedented attacks from other governments.” Shortly thereafter, trade talks with Canada were abruptly suspended – until Ottawa scrapped its digital tax.

While the EU steps up enforcement to hold platforms accountable and protect public discourse, Brussels finds itself navigating a rising tide of censorship accusations, especially from Washington, where MAGA-aligned officials are watching closely – and tech platforms are rallying behind them.

Censorship or systemic risk?

One of the central anxieties around the Code’s elevation into a DSA instrument is whether it undermines freedom of expression. While the EU Commission has consistently framed the Code as a voluntary mechanism, its transformation into a compliance tool under Article 35 of the DSA means that failing to adhere to its commitments may now trigger investigations or fines.

In May, Rep. Jim Jordan (R-OH), Chairman of the House Judiciary Committee, and four other congressmen sent a letter to EU Commissioner Michael McGrath. They argued that since the DSA requires platforms to systematically censor “disinformation” and most companies won’t create separate moderation systems for Europe and the rest of the world, the DSA could set de facto global censorship standards, restricting Americans’ online speech.

The European Commission maintains this is a misreading of the law. “The [Disinformation] Code is not about censorship,” Thomas Regnier, a Commission spokesperson, told Tech Policy Press in an email. “On the contrary, it is a framework aiming to create a transparent, fair and safe online environment, while fully upholding the fundamental rights of users, including freedom of expression.” Freedom of expression, the spokesperson added, “lies at the heart of the DSA.”

The Commission emphasized that the distinction lies in the DSA’s structural focus. Rather than going after individual pieces of content, the law zeroes in on transparency, accountability, and systemic risk, targeting opaque recommender algorithms and ad networks that shape what users see.

“The Code of Practice on Disinformation is not geared toward content removal,” Regnier said. “Its commitments aim to protect against manipulation of online platforms, giving users more context and tools to navigate safely – not suppressing content.”

“Content moderation does not mean censorship,” the spokesperson added. “The DSA requires platforms to be transparent about moderation practices, including ‘shadow banning,’ and empowers users to challenge decisions.”

For Clare Melford, CEO of the Global Disinformation Index (GDI), the “censorship” framing is not only flawed – it’s deliberate. “Trying to say governments are censoring is a fundamental misunderstanding of how technology works,” she said. “The speech that is actually being suppressed is moderate speech, because it’s less profitable.”
