
How would you like America to abide by European censorship rules?

Brussels’s censorship machine could change what Americans can say online.

March 10, 2026

By Wendi Strauch Mahoney

Reprinted from American Thinker

According to a 160-page interim staff report released on Feb. 3, 2026, nonpublic documents produced to the House Judiciary Committee under subpoena reveal that the European Commission successfully pressured social media platforms to change their global content moderation rules, directly harming Americans’ online speech.  The report suggests that the European Commission has been working for a decade to censor speech.

The European Commission says platforms have already changed their systems and interfaces under the DSA.  The law requires easier reporting of illegal content, priority handling for “trusted flaggers,” and statements of reasons for moderation decisions, as well as broader risk-management obligations for very large platforms.  These obligations go beyond obviously illegal content and extend into recommender systems; terms and conditions; and “systemic risks” tied to civic discourse, elections, and public security.  The Commission has used that authority to investigate X’s recommender systems and Grok; to press TikTok for information about elections, media pluralism, and civic discourse; and to pursue Meta over notice-and-appeal mechanisms and researcher access.

The House Judiciary Committee’s focus sharpened in August 2024 after then–E.U. commissioner Thierry Breton threatened X with regulatory retaliation under the Digital Services Act for hosting a live interview with President Trump ahead of the 2024 election.  From there, the committee subpoenaed ten major tech companies.  It later sought records from outside groups, including Access Now and the Institute for Strategic Dialogue, and also requested documents from Stanford after discovering a 2025 event involving foreign regulators and censorship coordination.  The committee has received tens of thousands of pages of nonpublic platform records and regulator communications.

The report’s appendix lays out an extensive exhibit trail divided into Internal Platform Exhibits, External Platform Exhibits, E.U. Internet Forum Exhibits, Hate Speech Code Exhibits, and Disinformation Code Exhibits.  The listed materials include TikTok reports and slide decks; Google internal emails; YouTube and Spotify communications with the Commission; the E.U. Internet Forum handbook; draft hate-speech code materials; and a long run of disinformation-code meeting invitations, agendas, emails, and roundtable summaries stretching from 2020 through 2024.

A Decade of European Censorship

Beginning with the E.U. Internet Forum (EUIF) Code of Conduct in 2016 and continuing through the Digital Services Act (DSA) in 2023, the European Commission held over 100 meetings to pressure platforms to “aggressively censor” content, according to the Committee’s report, “The Foreign Censorship Threat, Part II: Europe’s Decade-Long Campaign to Censor the Global Internet and How it Harms Speech in the United States.”  The measures were sold as cooperative, but the committee says internal emails show the opposite.

The report alleges that

major social media platforms censored true information and political speech about some of the most important policy debates in recent history — including the COVID-19 pandemic, mass migration, and transgender issues, claiming it was combating hate speech and disinformation.

The pressure became concrete in December 2025, when the Commission issued its first noncompliance fine, €120 million against X, for allegedly “breaching its transparency obligations under the DSA.”  X filed an appeal challenging the fine in late February.

The 2018 Code of Practice on Disinformation, revised in 2022, and the EUIF handbook helped define the policies for content moderation.  In addition, a Disinformation Code Task Force was convened to “discuss platforms’ approach to censoring so-called disinformation.”  The task force identified a number of topics including elections, fact-checking, and “demonetization of conservative news outlets.”  The handbook treats lawful but disfavored speech as suspect, including “populist rhetoric,” “anti-government/anti-E.U.” content, “anti-elite” content, “political satire,” “anti-refugee/immigrant sentiment,” “anti-LGBTIQ” content, and even “meme subculture.”

The Committee argues that European speech standards are changing not only local decisions, but also the underlying rulebooks platforms use worldwide.  According to the report, the standards are “neither voluntary nor consensus-driven.”  Because major platforms generally rely on global community standards rather than maintaining a separate set of rules for each jurisdiction, the report contends that pressure from Brussels inevitably spills over into American discourse.  In that context, platforms that resist compliance risk regulatory retaliation, a concern the report suggests is illustrated by the Commission’s fine against X.

The report highlights COVID as one of the clearest case studies proving that pressure from the Commission changed platform moderation standards.  TikTok changed its moderation standards to “censor content questioning established narratives about the virus and the vaccine.”  Notably, European officials allegedly pressed platforms to update terms of service and moderation practices ahead of the vaccine rollout, with the knowledge and approval of top Commission leadership.

One TikTok communication cited by the report said the platform was “monitoring” satire related to vaccinations to determine whether more censorship was needed.  The report further says the Commission pushed platforms to report what they were doing to fight vaccine “misinformation,” and that this pressure reached the point of changing moderation standards before a single vaccine had even been delivered.

The Committee highlights TikTok emails and reports describing the use of COVID-19 and vaccine “Notice Tags” to limit the reach of posts and redirect users toward what the platform called “trusted, authoritative content” from the WHO or government-endorsed local sources.  Internal platform documents show TikTok revised its global Community Guidelines “to achieve compliance with the Digital Services Act.”

Those changes marginalized true information that regulators said was “presented out of context.” Alarmingly, the report states that the European Commission “focused on censorship of U.S. content,” including censoring COVID-19 content and election-related content.

The report also notes that the E.U. “regularly interferes” in E.U. member-state national elections, allegedly “controlling political speech during election periods.”  It has allegedly censored content ahead of elections in Slovakia, the Netherlands, France, Moldova, Romania, and Ireland, according to the report.  This includes the censorship of speech on topics including “migration, climate change, security and defense, and LGBTQ rights.”  The report cites several examples of censorship prior to the 2023 Slovak election.

The Commission issued DSA election guidelines in 2024 requiring platforms to comply with E.U. censorship demands and the “best practices outlined in the Disinformation Code.”  Under those guidelines, the Commission continues to press platforms to

  • reduce the prominence of disinformation;
  • decrease the reach of AI that “depicts disinformation or misinformation”;
  • label posts “deemed” disinformation by “government-approved, left-wing fact-checkers”;
  • develop and apply “inoculation measures that preemptively build resilience against possible and expected disinformation narratives”; and
  • take steps to stop “gendered disinformation.”

The Committee contends that the E.U. “shows no signs of abating” its censorship campaign, citing the X fine as proof positive that it plans to enforce the policies.  It also highlights the November 12, 2025, joint communication issued by the European Commission and the High Representative for Foreign Affairs and Security Policy on the European Democratic Shield (EDS).  EDS is framed as a way to “empower strong and resilient democracies.”  The initiative proposes additional measures aimed at enhancing situational awareness, protecting the integrity of the information space, strengthening operational coordination, and building resilience against foreign information manipulation, interference, and disinformation.  The Committee sees this as further evidence that Europe is trying to export its flawed censorship model to the United States, both directly and indirectly, through the global policies of major platforms.
