Coordinated disinformation network uses AI, media impersonation to target German election

13 February


With less than two weeks before the German federal election, ISD has discovered a coordinated network on X (formerly Twitter) spreading disinformation about German politicians and election-related terror threats. The network of roughly 50 accounts, which shares traits with the pro-Russia campaign Operation Overload, disseminates false claims through videos designed to look like they come from media outlets, law enforcement agencies and academics. At times, it uses AI-generated audio and visuals.

A bot network of more than 6,000 accounts amplifies the videos, often tagging media outlets or fact-checkers in its posts. The volume of new videos increased dramatically in the second half of January 2025. However, this increased output has not resulted in more organic engagement, with bots responsible for nearly all shares. Despite focusing on the German election, none of the videos or posts are in German. This suggests the network intends to overwhelm fact-checkers and mislead international observers rather than influence Germans themselves. 

Key Findings

  • A network of at least 48 X accounts engaged in a campaign targeting Germany’s federal elections on 23 February 2025. This network became active in November 2024, initially focusing on allegations of rising antisemitism in Germany. Since 10 January 2025, it has turned its attention to the elections.
  • A secondary network comprised of more than 6,000 accounts is simultaneously reposting the content, ensuring that the original network’s videos are shared thousands of times within minutes. This pattern is strong evidence of coordinated inauthentic behaviour (CIB).
  • Nineteen of the 33 videos posted so far in 2025 were created between 27 and 30 January, indicating that the network may be escalating its activity.
  • Recent videos posted by the network feature disinformation about election-related terror threats and false claims about German politicians, particularly allegations of corruption and paedophilia. Among the main targets of the campaign are Friedrich Merz, the current candidate for chancellor from the Christian Democratic Union of Germany (CDU) party (centre-right); Janine Wissler, a Die Linke politician (left-wing); and Armin Laschet, the former CDU chancellor candidate now running for parliament (centre-right).
  • Videos shared by the network carry branding from legitimate media organisations, including Deutsche Welle (DW), BBC and Sky News. They also imitate government agencies and academic institutions. The operation has impersonated at least 20 organisations since the beginning of 2025, at times using AI to manipulate the audio of real videos or adding captions featuring false claims. The network’s tactics – impersonating legitimate organisations in videos, using AI, posting QR codes, and tagging journalists and fact-checkers – are commonly used by a Russia-aligned information operation known as ‘Operation Overload,’ also referred to as Matryoshka.
  • Despite these efforts, the campaign appears to have had limited impact, with the majority of its engagement coming from this secondary bot amplification network. However, the 48 accounts ISD initially identified have collectively received 2.5 million views, with the network’s engagement nearly tripling in January.
  • The network is sharing content in several languages, including English, Spanish, and Arabic, but not in German. As such, it is likely that the aim of this campaign is to undermine trust in German elections among international audiences. This is consistent with information manipulation strategies employed by the Kremlin in recent European elections.

Network Activity

Initial posts from the network mentioning Germany largely focused on allegations of rising antisemitism in the country. A notable spike came on 18 November, when seven posts featured claims such as ‘German users leave the most antisemitic comments on social media’ and ‘Germans refuse to buy homes near where Jews live’. These posts were then amplified by thousands of bot accounts. While the exact cause of this spike is unclear, the German Bundestag passed a resolution to combat antisemitism on 6 November that proved highly controversial.

Figure 1: Shares of content from the coordinated network focusing on Germany from 1 November 2024 – 30 January 2025.

In mid-January 2025, however, the network’s focus shifted towards the German elections. At the time of writing, a total of 33 videos had mentioned this topic, compared to just one in the previous month. A notable spike in the network’s activity occurred between 27 and 28 January, when 13 of the 33 videos were posted. Each was shared by hundreds of amplifier accounts within a minute of publication, suggesting CIB. A further six videos about the German elections were posted on 30 January.

Figure 2: Shares of network content per minute on 27 January 2025 between 2pm and 8pm CET.

Narratives 

ISD identified three main narratives being propagated by the network:

1. Allegations of terrorism and claims that election security is under threat in Germany  

  • Twenty-two posts published in January contained false and misleading claims about election-related terrorist threats. These include fabricated warnings purportedly from foreign intelligence services instructing citizens not to travel to Germany during the election. Some videos feature voter suppression narratives, including claims that election sites or individual voters are likely targets of terrorist attacks, ranging from poisoned envelopes to bombings. Discussions of alleged terrorist plots mentioned the Islamic State (IS). One video falsely claimed that a former intelligence official called for postponing the federal election. ISD found no evidence that supports any of these claims.
  • One example uses footage from a real video published by the West Midlands Police in the UK, in which a special constable discusses his work. A likely AI-generated voice has been added halfway through the video, which says: “our information indicates that a series of major terrorist attacks are prepared in Germany during the early election. We passed on all the information to our German colleagues. Even if this information does not help to prevent terrorist attacks”.
  • Claims made in the videos range from unlikely to outlandish, and the general tone is dark and threatening. One video, which received more than 60,000 views, claims that 68 percent of Germans are too afraid to leave the house on Election Day. Germany is portrayed as unable to protect its citizens and ensure the integrity of its elections. ISD found no evidence of these claims in statements from German officials or in the media.

2. Attacks on German political candidates and endorsement of others

  • Ten videos target political candidates, mostly from the centre-right CDU and CSU parties. These individuals are accused of illegal activities, with financial misconduct and child abuse being the most prominent allegations.
  • These allegations are presented through audio and/or subtitles that appear to have been added retrospectively to video reports from credible media outlets. These additions typically distort the message of the original video, leading viewers to false conclusions. At times, the original audio was altered using AI.
  • In some cases, videos cite false past scandals or controversies to further reinforce the accusations. One of the videos also endorses the far-right Alternative für Deutschland (AfD) as “the only party truly concerned for German citizens”. It also claims that the German elections will be “the dirtiest and most deceitful manipulation in the history of mankind”.

3. Content discrediting Ukrainian refugees and lamenting German aid to Ukraine

  • Seven posts from accounts in the network target Ukrainian refugees in Germany and German aid to Ukraine. They claim that Ukrainian refugees present a problem for the German government and accuse them of committing crimes. One video, impersonating the Euronews TV channel, alleges that Germany is preparing bomb shelters for a possible Russian attack and that German citizens blame their government for weakening national security through its support for Ukraine.

Network Tactics and Traits

ISD identified the network as having traits of Operation Overload, a pro-Russian information operation whose primary objective appears to be overwhelming international fact checkers and media institutions that monitor disinformation. The operation was active in high-profile events throughout 2024, including the French legislative elections, the Paris Olympics and the US presidential election. Behaviours shared by this network and Operation Overload include:

  • Both impersonate content made by media organisations, research institutions, universities and law enforcement agencies to build credibility.
  • The network tags and mentions media organisations and researchers in its content. The main goal of this appears to be to overwhelm them with fact-checking requests, while secondary goals are expanding their reach and achieving greater credibility by making it seem like media organisations are already aware of this content.
  • Both the network and Operation Overload create posts containing videos and QR codes.
  • Both make use of AI to manipulate content. Frequently, genuine media reports are used but with their audio altered, or with subtitles added to spread false claims. Based on both manual analysis of the network’s content and the use of specialised software, ISD believes the audio in many of the posts has been AI-generated. For example, the audio often becomes noticeably more robotic partway through, usually once the speaker is no longer in frame and the footage has been overlaid with stock video. Operation Overload has also been reported to use AI-generated content in its previous disinformation campaigns.

The network produced content primarily in English, with some videos in French, Arabic, Spanish and Japanese. However, none of the posts were in German; this suggests that it is targeting foreign audiences or non-German-speaking audiences within the country. The recent pivot of this network to focus on the German elections appears intended to undermine trust in their integrity at home and abroad.

Figure 3 (Left): A post with a manipulated video pretending to be from Sky News alleging that 61 percent of Germans are disappointed with their government “souring relations” with Russia. 
Figure 4 (Right): Another manipulated video falsely claiming MI6, the UK’s foreign intelligence agency, warned against traveling to Germany around the election.

Reach and Engagement

The network struggled to gain meaningful organic traction on X. Across the analysis period of 1 November 2024 to 30 January 2025, virtually all of the 10,597 shares originated from a coordinated amplifier network, with no evidence of organic virality. While a small number of replies and shares may have come from genuine users, they were vastly outnumbered by inauthentic engagement.

Multiple indicators suggest bot-driven amplification, but the clearest evidence is the synchronised engagement pattern: every repost was made within a single minute. For instance, a post on 27 January, falsely claiming that explosive devices had been planted near polling stations in Germany, received all 202 of its reposts at exactly 6:33 pm CET, six minutes after the post’s original publication.
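As a rough illustration only (not ISD’s actual methodology), the synchronisation signal described above can be sketched as a simple heuristic: for each post, measure what fraction of its reposts fall within the single busiest one-minute window. The function name and toy data below are hypothetical.

```python
from collections import Counter
from datetime import datetime

def synchronized_share_ratio(repost_times):
    """Fraction of a post's reposts falling in its single busiest
    one-minute window.

    repost_times: ISO-8601 timestamp strings, one per repost.
    A ratio near 1.0 means nearly every repost landed in the same
    minute -- the coordination pattern described above.
    """
    if not repost_times:
        return 0.0
    # Truncate each timestamp to minute precision and count occurrences
    minutes = [datetime.fromisoformat(t).strftime("%Y-%m-%d %H:%M")
               for t in repost_times]
    busiest = Counter(minutes).most_common(1)[0][1]
    return busiest / len(repost_times)

# Toy data mirroring the 27 January example: 202 reposts, all at 18:33 CET
reposts = [f"2025-01-27T18:33:{i % 60:02d}" for i in range(202)]
print(synchronized_share_ratio(reposts))  # 1.0
```

In practice a detection pipeline would combine this with other signals (account age, posting cadence, content similarity), since a single viral minute can occasionally occur organically.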

While engagement was confined to the immediate network and its overall impact is therefore limited, the operation’s content likely reached some authentic users on X. The 48 identified accounts collectively received 2.5 million views, averaging 52,488 views per post.

Nevertheless, there was little organic engagement among X users. This could be due to factors including media literacy among those exposed, platform intervention limiting the visibility of suspicious content, or the content’s overall poor quality; many posts were unconvincing, bizarre or confusing. The messaging was also poorly tailored for a German audience. None of the posts were in German, yet they targeted politicians such as Michael Piazolo, Armin Laschet and Andreas Scheuer who, according to Google Trends data, are relatively unknown outside Germany and German-speaking countries.

Implications and Conclusions

While the campaign achieved limited reach, it is significant for several reasons. Firstly, it has grown substantially in recent weeks. While posts averaged 23,689 views in November and December 2024, this figure nearly tripled to 65,578 views per post in January 2025. ISD assesses this increase to be primarily the result of inorganic methods.

Secondly, the operation has generated a flurry of new content in recent weeks. On 27 January, it reached its highest volume day with 10 unique videos, followed by six more videos added on 30 January, collectively receiving 1.3 million views. While none of this content seems to have penetrated the wider public so far, amplification from influential accounts could help the network achieve this. With continued action from the network, virality is certainly possible, especially as the election draws closer.

Third, the operation’s impersonation of real media, academic institutions and public figures has real-world implications. Examples include an apparent use of deepfake technology to alter a Thanksgiving message from the president of the University of Virginia into an endorsement of the AfD, as well as the doctored announcement from a UK police department mentioned above. The targeting of these entities indicates a broader goal to the campaign: further undermining trust in media and institutions in Western countries. Misuse of their images and messages has reputational consequences for these individuals and organisations. Social media platforms bear responsibility for the proliferation of this content, much of which violates their own policies around inauthenticity and/or impersonation.

Operation Overload is known to leverage AI-generated audio to fabricate and manipulate content. Many of these campaigns take the approach of using new audio technologies to generate scalable, highly realistic false content as a cheaper alternative to producing believable deepfake videos. This makes it harder to detect and counter, allowing coordinated inauthentic networks to boost their reach. As these tools become more sophisticated, they lower the barrier for influence operations. They also risk degrading trust in information at critical moments for national security and public safety.

Last, the operation – likely led by a hostile foreign actor – employs a set of tactics designed to manipulate public opinion around the German elections. These include spreading fear about election security, undermining trust in institutions, and amplifying misleading content via CIB. While its impact remains limited, these tactics highlight ongoing risks to democratic processes.