
Polish Election Country Report 2025

Saman Nazari, Alliance4Europe
Dr. Virginie Andre, Debunk.org
Sophie Sacilotto, Debunk.org
Kinga Margas, Alliance4Europe
Malak Altaeb, Debunk.org
Joel Boehme, Alliance4Europe
Ewan Casandjian, Alliance4Europe
Dr. Jakub Kubś, GLOBSEC
Dr. Maria Giovanna Sessa, EU DisinfoLab

Givi Gigitashvili, DFRLab
ISD Researchers
Maria Voltsichina, Debunk.org
Pavlo Kryvenko, Debunk.org
Laima Venclauskienė, Debunk.org
Kamila Korońska, University of Amsterdam
Larissa Doroshenko, Alliance for Securing Democracy
Peter Benzoni, Alliance for Securing Democracy
Jakub Szymik, CEE Digital Democracy Watch
Martyna Hoffman, Political Accountability Foundation
Aleksandra Wojtowicz, Independent Researcher
Aleksy Szymkiewicz, Demagog Association Poland
Adam Maternik, Demagog Association Poland
Miłosz Dzienio, Independent Researcher
INFO OPS Poland Foundation Researchers (2)
Duncan Allen, Democracy Reporting International
Wojciech Solak, Civic Resilience Foundation
Shiva Shah, The Global Security Initiative
Zachary Horsington, The Global Security Initiative


Executive Summary

This report provides a critical assessment of Foreign Information Manipulation and Interference (FIMI) during the election period of the 2025 Polish presidential elections. It examines the methods employed, key perpetrators, evolving tactics, and the efficacy of defensive responses.
Our findings, drawn from the collective monitoring and response effort of 28 organisations and the generation of 20 incident alerts, highlight the persistent and multifaceted information threat facing the Polish presidential elections.

A recurring strategic narrative consistently emerged during the election period, portraying EU countries and Ukraine negatively, often accusing them of harming Poland and attempting to manipulate its elections. This narrative strategically positioned far-right Polish politicians as defenders of national sovereignty against perceived external influence.
Key influence operations targeting the elections included the Doppelganger operation, Operation Overload, and the Pravda Network, all of which disseminated such misleading narratives. Radio Belarus, a sanctioned Belarusian state media outlet, also actively interfered by amplifying ideologically aligned candidates. Lega Artis, Citizen GO, Ordo Iuris, and other foreign-aligned actors were also found to have amplified polarising narratives and promoted candidates that aligned with their interests.

These influence operations manipulated public opinion, often via the use of fabricated online personas and coordinated inauthentic behaviour. They exploited several vulnerabilities in social media platforms, which could be seen as potential systemic risks as defined by the Digital Services Act (DSA). These include the ease of creating accounts on X, the lack of advertisement ‘Know Your Customer’ (KYC) principles on Meta, and inconsistent policy and moderation of murky accounts conducting political campaigns on TikTok.
Finally, the Polish threat landscape was remarkably consistent across the 2017, 2021, and 2025 election campaigns. This contrasts with other contexts, where tactics have evolved more dynamically; in Poland, the same tactics have been a persistent issue for years.

Beyond detailing tactics employed by FIMI actors, this report exposes the concerning adoption of similar manipulative tools by domestic actors to gain political leverage. Critically, it also highlights that despite interventions from civil society, regulators, and platforms, significant deficiencies in policy enforcement and platform accountability persist. The report concludes with recommendations on improving civil society engagement to counter future FIMI threats.

While the Polish elections were targeted by foreign influence operations and suffered information manipulation incidents, analysis shows that their impact seems to have been constrained by several factors, including public resilience, active civil society responses, and the limited operational sophistication of some campaigns. However, in an increasingly polarised information ecosystem where electoral outcomes can be razor-thin, the imperative for robust national resilience against such threats remains paramount.
This approach acknowledges that resilience is not solely the responsibility of governments or platforms, but requires the active, coordinated participation of all societal sectors, including civil society, media, academia, the private sector, and individual citizens. Consequently, continuous work to foster whole-of-society resilience against FIMI is all the more necessary.

Persistent incentives and vulnerabilities: the FIMI landscape is shaped by:

  • Poland’s geopolitical stance against Russia, making it a prime target for destabilisation.
  • Exploitation of public support for Ukraine and its refugees to sow internal division.
  • Weaponisation of the Belarusian border crisis to amplify migration tensions and political pressure.
  • Leveraging domestic economic uncertainty and inflation to erode public trust in governance.
  • Deep-seated domestic political polarisation, providing fertile ground for divisive narratives.
  • Existing EU scepticism within segments of Polish society, targeted to undermine European integration.
  • Critical absence of a permanent Digital Services Coordinator, creating a regulatory vacuum.

High-Impact Narrative Amplification:

  • Anti-Ukrainian narratives aim to degrade public support for Ukraine in the context of Russia’s war of aggression and the influx of Ukrainian refugees, fostering social fragmentation.
  • Anti-EU narratives seek to erode trust in European institutions and Polish membership, threatening foreign policy cohesion.
  • Anti-Establishment narratives are designed to delegitimise democratic governance, including the incumbent government as well as public institutions, fostering political instability and public distrust.


Sophisticated FIMI operations and information manipulation:

  • Coordinated operations – Doppelganger, Operation Overload, the Pravda Network, and the sanctioned Radio Belarus actively disseminated misleading narratives and amplified ideologically aligned candidates.
  • Foreign-aligned operations – Lega Artis, Citizen GO, and Ordo Iuris further amplified polarising content.
  • Unattributed or unaffiliated operations, such as covert ad campaigns (unaffiliated), a coordinated inauthentic behaviour network seemingly manipulating TikTok’s algorithm (unattributed), a Nigerian clickbait website (unaffiliated), and murky accounts were also observed engaging in information manipulation. Their objectives included promoting specific candidates, demoting oppositional candidates, and increasing political polarisation.

Unfair Political Actor Conduct: domestic political figures, particularly from the far-right and conservative-nationalist spectrum, were significantly associated with information manipulation. This included the fabrication of personas for self-promotion, the involvement of pro-Russian domestic actors, and the dissemination of false information about election procedures.

Exploited Systemic Platform Risks:

  • X (formerly Twitter) lacks adequate checks on account creation, enabling large-scale coordinated inauthentic behaviour networks.
  • Meta demonstrates weak ‘Know Your Customer’ (KYC) checks in its ad system.
  • TikTok exhibits weak and inconsistent enforcement of its political campaigning policy, allowing circumvention, impersonation, and deceptive campaigns.

Read the full report here.

Policy Implications and Recommendations

The 2025 elections revealed a troubling discrepancy between identified FIMI threats and the adequacy of institutional and platform responses. Bridging this enforcement lag requires immediate and sustained policy action. Only through sustained engagement, robust institutional frameworks, and an empowered, collaborative civil society can Poland effectively safeguard its democratic processes from evolving digital threats and ensure the integrity of its information environment.

Strengthen institutional oversight:

  • Establish a permanent and well-resourced Digital Services Coordinator (DSC) for Poland: this is paramount for ensuring effective DSA implementation, robust oversight of platforms, and seamless cross-sector coordination in threat response.
  • Enhance inter-institutional and cross-sector coordination: develop a permanent incident escalation system to facilitate real-time communication and intervention between government agencies, civil society, and platforms, extending beyond electoral cycles. This includes support for and coordination of initiatives such as the Polish Resilience Council, the Central European Digital Media Observatory (CEDMO), the Counter Disinformation Network, and the FIMI Information Sharing and Analysis Centre (FIMI-ISAC).

Enforce platform accountability and transparency:

  1. Mandate robust systemic risk mitigation: hold platforms strictly accountable for assessing and mitigating systemic risks (per DSA Articles 34 & 35) to prevent interference.
  2. Ensure ad transparency and verification: compel platforms to implement stringent advertiser identity verification (KYC) and immediately halt revenue streams to malign actors (per DSA Article 26 and Political Advertising Regulation).
  3. Address platform vulnerabilities: mandate that platforms patch vulnerabilities like easy account creation for throw-away profiles and ensure consistent enforcement of political campaigning policies, with punitive measures for non-compliance.
  4. Strengthen researcher data access: fully enforce DSA Article 40 to ensure transparent and rigorous data access for qualified researchers; crucial for analysing algorithmic behaviours and content virality.

Build national resilience:

  1. Invest in comprehensive media and digital literacy: implement targeted, age-appropriate programmes to equip Polish citizens with critical thinking skills, fostering informed media consumption and addressing distrust in public institutions.
  2. Sustain civil society funding: provide stable, long-term funding mechanisms for Polish civil society organisations to build capacity and institutional knowledge, ensuring their continued ability to counter systemic threats beyond electoral cycles.
  3. Implement targeted accountability for domestic FIMI amplifiers: develop mechanisms for public attribution and other targeted responses to address domestic figures who knowingly disseminate or echo manipulative narratives, clarifying the line between legitimate political discourse and deliberate deception.

Notes:

  1. Contributors are researchers who have submitted alerts to the project but have not participated in the writing of this report. The report does not necessarily reflect the opinions of the contributors.
  2. Info Ops Poland Foundation contributed with an alert on the murky ad campaign outlined on page 81, and the report on Russian narratives discrediting political candidates on page 20. Their researchers wish to remain anonymous.

Acknowledgements:

This report was made possible through the FIMI-ISAC project ‘FIMI Defenders for Election Integrity’ and the Counter Disinformation Network infrastructure. Read more about these below.

About the Project:

This report evaluates Foreign Information Manipulation and Interference (FIMI) threats to the 2025 Polish presidential elections. It was developed through the FIMI-ISAC project ‘FIMI Defenders for Election Integrity’. The project consortium brings together FIMI-ISAC members, drawing on the expertise of 10 organisations, to develop a multistakeholder FIMI framework for elections that can effectively monitor, respond to, and counter FIMI threats before and during elections, while strengthening FIMI defender communities and democratic institutions. The monitoring and response effort also involved engaging and coordinating with 10 in-country partners from across Polish civil society and academia. Over the course of these monitoring efforts, the consortium produced a series of incident alerts circulated to relevant election stakeholders in real time. These incident alerts detail key information about FIMI incidents and their impact in the country of focus and provide a set of recommendations for response. Where insights derived from these incident alerts are mentioned throughout this report, they are signposted with an alphanumeric code beginning with ‘IA’.


About the FIMI-ISAC:

The FIMI-ISAC (Foreign Information Manipulation and Interference Information Sharing and Analysis Center) is the first ISAC worldwide dedicated to fighting FIMI and creating common standards in this field. It unites a group of like-minded organisations that protect democratic societies, institutions, and the critical information infrastructures of democracy from external manipulation and harm. Through collaboration, the FIMI-ISAC enables its members to detect, analyse, and counter FIMI more rapidly and effectively while upholding the fundamental value of freedom of expression. The FIMI-ISAC does not act independently to counter FIMI. Instead, enhancing collaboration empowers its members to do so more effectively.

https://fimi-isac.org/

About the Counter Disinformation Network:

This report and project were facilitated through the Counter Disinformation Network (CDN).

The CDN is convened by Alliance4Europe and functions as a collaboration and crisis response platform, knowledge valorisation resource, and expert network, bringing together 60+ organisations and 330+ practitioners from OSINT, journalism, fact-checking, academia, policy, and strategic communication from 20 countries.

The network has been used to coordinate projects on 5 elections, providing researchers with an incident alert (IA) template, access to social listening tools, a collaboration and coordination platform, a shared methodology, and a mailing list of actors who can address influence operations (e.g. regulators, ministries, media, and policymakers).

The network has produced 80+ incident alerts highlighting detected cases of information manipulation to authorities, policymakers, media, and advocacy organisations.

The CDN has so far managed to disrupt hundreds of Russian influence operation assets, address platform vulnerabilities, and highlight systemic risks, all made possible through contributions from a wide range of organisations.

For more information on the network, its methodology, and funding, please visit:
https://alliance4europe.eu/cdn