Cashing in on conflict: TikTok profits from pro-Kremlin disinformation ads
24 August 2023
By: Francesca Visser, Julia Smirnova, and Clara Martiny
New research by ISD reveals that TikTok has been profiting from ads featuring pro-Kremlin disinformation and conspiratorial narratives. Ads found via TikTok’s ‘Commercial Content Library’ also included content discrediting Ukraine and Ukrainians, content glorifying the Russian invasion, and content depicting the brutal Wagner mercenary group in a positive light.
In late July 2023, TikTok released an updated version of its Ad Library, offering insights into campaigns that use the platform to target European audiences. To be eligible for placement on TikTok, ads must adhere to “all applicable laws, rules and regulations that apply to the targeted regions” as well as to the platform’s Terms of Service, Community Guidelines, and the policies outlined in the Business Help Center. These include policies intended to prevent extremist content (such as the promotion of violent and hateful actors), hate speech, and disinformation from circulating on the platform.
However, previous research from ISD and others has shown that TikTok consistently falls short of effectively enforcing these policies on user-uploaded content related to the war in Ukraine. We set out to examine whether the same is true for ads.
Key findings:
- ISD identified a total of 32 Ukraine-related ads that appear to violate TikTok’s policies, published by 19 unique accounts. All were posted within the last ten months and garnered an estimated 104,000 to 108,000 views from users in five European countries: Germany, Spain, France, Italy, and the UK.
- After the full-scale invasion of Ukraine in February 2022, TikTok blocked EU users’ access to Kremlin-controlled media outlets. However, ISD identified three ads featuring content from Russian state media, which received a total of 13,000 views. Since the ads were not posted by official state media accounts, they did not receive a ‘state-affiliated media’ label. This loophole allows Kremlin propaganda to circulate on TikTok without users being informed about the bias of these sources.
- The ads identified included false claims or conspiratorial narratives, content discrediting Ukraine and denying its right to exist, glorifications of the invasion of Ukraine and Russian actions during the war, and positive depictions of the Wagner Group.
Methodology
ISD analysts set out to examine whether potentially problematic ads about the war in Ukraine are being targeted at European TikTok users. To do so, we used the TikTok Ad Library to find relevant ads by conducting keyword searches in four languages: English, German, Spanish, and French. These searches included ‘Putin’, ‘Russia’, ‘Ukraine’, ‘Zelensky’ and ‘Wagner’ (and various alternative spellings of these terms). Data collection and drafting for this study concluded before the death of the Wagner Group’s leader, Yevgeny Prigozhin, was announced on 23 August.
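For illustration, the sketch below shows how this kind of multilingual keyword screening with alternative spellings could be implemented. It is a minimal sketch, assuming hypothetical exported ad records with ad_text and caption fields; it is not TikTok’s actual Ad Library interface or data format, which analysts queried through its web front end.

```python
import re

# Search terms with some alternative spellings across the four languages
# (illustrative list, not the exact set used in this study)
KEYWORDS = [
    "putin", "poutine",
    "russia", "russland", "rusia", "russie",
    "ukraine", "ucrania",
    "zelensky", "zelenskyy", "selenskyj", "zelenski",
    "wagner",
]
PATTERN = re.compile("|".join(re.escape(k) for k in KEYWORDS), re.IGNORECASE)

def matches_keywords(ad: dict) -> bool:
    """Return True if any text field of an ad record mentions a search term."""
    text = " ".join(str(ad.get(field, "")) for field in ("ad_text", "caption"))
    return bool(PATTERN.search(text))

# Example: screen a small batch of (hypothetical) ad records
ads = [
    {"ad_text": "Selenskyj und die NATO ...", "caption": "politik"},
    {"ad_text": "Summer shoe sale", "caption": "fashion"},
]
candidates = [ad for ad in ads if matches_keywords(ad)]
print(f"{len(candidates)} ad(s) flagged for manual review")  # prints: 1 ad(s) ...
```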
Relevant ads returned by these searches were then coded according to the following categories:
- Pro-Kremlin talking points including false or conspiratorial narratives;
- Content discrediting Ukraine or Ukrainians;
- Content celebrating or glorifying the war against Ukraine and featuring pro-war symbols;
- Promotion of the Wagner group.
The overview below provides highlights from TikTok’s Community Guidelines that are particularly applicable to each of the categories of content identified by ISD during the investigation.

Category: Pro-Kremlin talking points including false or conspiratorial narratives
Relevant policy: Misinformation
“We do not allow inaccurate, misleading, or false content that may cause significant harm to individuals or society, regardless of intent. Significant harm includes physical, psychological, or societal harm, and property damage.”

Category: Content discrediting Ukraine or Ukrainians
Relevant policy: Hate Speech and Hateful Behaviours
“We do not allow any hateful behavior, hate speech, or promotion of hateful ideologies. This includes content that attacks a person or group because of protected attributes, including: caste, ethnicity, national origin, race[…]” Content that is not allowed on the platform includes “demeaning someone on the basis of their protected attributes by saying or implying they are physically, mentally, or morally inferior, or calling them degrading terms, such as criminals, animals, and inanimate objects”.

Category: Content celebrating the war and featuring pro-war symbols
Relevant policy: None directly. Content glorifying the war in Ukraine or including pro-war symbols is not covered by TikTok’s existing Community Guidelines. However, TikTok Ads must comply with “all applicable laws, rules and regulations that apply to the targeted regions”. In Germany, glorification of war crimes, including display of the ‘Z’ symbol, is regarded as an endorsement of aggression and war crimes and can therefore be prosecuted under Section 140 of the German Criminal Code.

Category: Promotion of the Wagner group
Relevant policy: Violent and Hateful Organizations and Individuals
“We do not allow anyone to promote or materially support violent or hateful actors. Content that may appear neutral, such as referencing a quote from a hateful organization, must make clear that there is no intent to promote it. We make limited allowances for people to discuss violent political organizations, but only if: (1) their causes are recognized as legitimate under international legal frameworks, (2) they do not primarily target civilians, and (3) the content does not mention violence.” While it is not clear whether the Wagner group is classified as a “hateful organization” by TikTok, much of the content supporting the group would likely meet the threshold for “violent and hateful”.
Pro-Kremlin false or conspiratorial claims
ISD analysts identified 14 ads containing false or conspiratorial claims about the conflict in Ukraine, which received a combined total of 62,000 views. Content in this category typically shifted responsibility for the war onto the US, or the West in general.
Some of these ads featured content from Kremlin-controlled media outlets, despite TikTok’s announcement that it would block access to RT and Sputnik for EU users following the full-scale invasion of Ukraine almost 18 months ago. ISD analysts found clips from RT programmes containing conspiratorial claims being used in ads mentioning Ukraine, suggesting that this policy is not being enforced on all forms of content.
One ad features a news segment from Sepa Más, an outlet affiliated with RT en Español. The text shown in the ad (below) refers to the Ukrainian government as a “brutal inhumane dictatorship allied with the Satanists and Islamists” and further alleges that “NATO is persecuting the Church because it does not adhere to its monstrous ideology.”
A second ad features content from another of RT’s Spanish-language programmes, Ahí les Va, with the channel’s logo prominently displayed in the top right corner of the video (see below). The clip showcases an interview with Ukrainian journalist Oleg Yasinsky, wherein he claims that Ukraine has been chosen “as a laboratory to destabilize Russia.”
In the video, Yasinsky asserts that the Ukrainian people have fallen victim to “highly professional media and ideological manipulation.” He goes on to label the Euromaidan protests of 2013-2014 a “coup d’état” resulting in the rise of “Western puppets” aided by “Ukrainian fascist forces.” The text accompanying the video reads, “Listen to this Ukrainian journalist. The US and NATO don’t want you to hear what he says.” The claims made by Yasinsky in this video are common among disinformers and conspiracy theorists when discussing the conflict in Ukraine.
Russian state media was not the sole source of clips like these used in ads with conspiratorial claims. Others drew on interviews with prominent voices who often propagate pro-Kremlin talking points. In one clip, Lara Logan claims that the CIA funded the 2013 and 2014 protests in Ukraine and then “selected” Ukrainian leaders. In another, Stew Peters argues that Ukraine is a “fake country” and that Zelensky was “installed”.
Two other videos in this category blamed the US for the attack on the Nord Stream pipelines, while another accused the US of being solely responsible for the collapse of the USSR and of attempting to destroy Ukraine.
Three of the ads featuring pro-Kremlin false claims also included other false claims that were not connected to the war in Ukraine, including conspiracy theories about vaccines, Hunter Biden, and NATO.
In one, Stew Peters (mentioned above) claims that COVID-19 vaccines are a “weapon of mass destruction”, that anyone involved in the implementation of COVID-19 vaccination policies should be held accountable, and that “the death penalty should be on the table”. This ad reached 4,000 users in the UK, despite TikTok’s Community Guidelines clearly stating that the company does not allow “paid advertising that advocates against vaccinations”. This example suggests that TikTok is falling short in applying its Community Guidelines to ads across multiple topics.
Content discrediting Ukraine or Ukrainians
ISD analysts found seven ads discrediting Ukraine as a nation or Ukrainians as a population, which received between 15,000 and 17,000 views. These videos appear intended to ridicule Ukrainian leadership and the military or undermine the credibility of Ukrainian refugees in Europe.
One Italian-language video compiled various headlines about Ukraine and Zelensky being corrupt and claimed that while “kids” were “sent to suicide”, the “U(SA)kraine government” was “selling its soul.” The video ends with a call for Italian-Russian solidarity and was seen by 3,000 users. An ad in English called Ukraine a “fake country”, repeating the common Kremlin narrative that denies Ukraine’s right to independence and sovereignty.
Another English-language video depicts a person who can be heard saying “are extremely disrespectful to ethnic minorities like myself”. It is not clear from the short audio clip to whom the person is referring; however, the accompanying text in the video implies the comment pertains to Ukrainian refugees. The video concludes with the text “NO NAZIS”, perpetuating a common disinformation narrative that wrongly portrays Ukrainian refugees as Nazis.
Content celebrating the war and featuring pro-war symbols
A total of ten ads, collectively receiving at least 24,000 views, were found to include content that glorified the invasion of Ukraine, the ongoing war, or Russian military victories. This content mostly originated from pro-war events held in Russia, particularly after the “annexation” of Donetsk, Luhansk, Kherson and Zaporizhzhia. These ads also included pro-war symbols like the letter “Z” and slogans such as “Za Rossiyu” (For Russia) or “Za mir bez natsizma” (For a world without Nazism).
In the aftermath of the invasion of Ukraine, public display of the letter ‘Z’ was banned by a number of European countries, including Latvia, Lithuania, and Germany. In Germany, displaying ‘Z’ in the context of the war in Ukraine is deemed punishable as it represents “approval of illegal activities” and a breach of international law. There are several precedents of people being fined or arrested for showing the symbol on cars or t-shirts or posting it on social media.
All ten ads in this category targeted German audiences with content containing the ‘Z’ symbol. Despite TikTok’s statement that its advertisers must comply with “all applicable laws, rules and regulations” in the countries of their intended audiences, there does not appear to be any enforcement against these videos.
Two other ads, published by the same account, feature the patriotic Russian singer Yaroslav Dronov (known as Shaman) during a performance of his song Vstanem (‘Let’s Rise’). This song was dedicated to Russian veterans and was performed on stage in Moscow on the anniversary of the full-scale invasion of Ukraine.
Promotion of Wagner Group content
Between 18 and 22 April 2023, shortly after Orthodox Easter, two identical videos supporting the brutal Kremlin-affiliated Wagner Group were posted as ads by the same TikTok account. These videos appear to celebrate the Wagner Group, and its leader Yevgeny Prigozhin in particular, for the decision to release Ukrainian prisoners of war. The video was originally posted on Telegram by Prigozhin’s press service in an apparent attempt to launder the group’s public image.
In the TikTok ad, the text (originally in French) reads: “Wagner boss Mr. Prigozhin has freed Ukrainian soldiers on Easter Eve to let them rejoin their families. Go back to your families and don’t come back.” The video features Ukrainian soldiers walking away as a Wagner soldier, who is filming, wishes them a “Happy Easter,” good luck, and good health. According to information provided via the TikTok Ad Library, the two ads were viewed by between 6,000 and 7,000 unique users.
The compassionate depiction of the mercenary group stands in stark contrast to the harsh reality of Wagner’s actions, not only in Ukraine but also in other countries where the group has been active since its establishment in 2014. Over the course of almost a decade, the group has been responsible for various atrocities, including documented instances of videotaped murders and torture inflicted on members of warring factions and even on former members within its own ranks.
TikTok’s Community Guidelines are clear that users are not permitted “to promote or materially support violent or hateful actors” and that this type of activity may result in an account ban. However, in this case TikTok has not only failed to prevent the monetisation of such content but has itself profited from it, albeit to an undisclosed degree. Despite ample evidence documenting the group’s crimes, content glorifying the Wagner Group continues to spread unchecked across various social media platforms.
Repeat offending
In addition to reviewing TikTok Ads, ISD analysts also examined past content posted by the identified accounts. This analysis aimed to determine whether any of these accounts had previously shared content that violated TikTok’s policies.
In total, ISD identified 18 unique accounts (responsible for 31 of the Ukraine-related ads) that had previously featured problematic content on their profiles. Among these, ten accounts had posted pro-Kremlin content, including false claims and conspiratorial material. Five of the ten had posted content glorifying the war in Ukraine, and six accounts had disseminated content aimed at discrediting Ukrainians. A further ten profiles had shared false claims or conspiracy theories on topics other than Ukraine, such as content denying the existence of global warming and conspiratorial content about LGBTQ+ rights. These findings underscore shortcomings in TikTok’s moderation efforts, revealing issues not only with ads but also with other forms of content on the platform.
Conclusion
Greater transparency from social media companies about the content on their platforms is always a step in the right direction. Ensuring that public-interest researchers have meaningful access to platform data is also key to understanding the impact of that content on users and society, and to holding platforms accountable.
The TikTok Ad Library gives researchers a baseline understanding of what kinds of ads TikTok users are being targeted with and how popular these are (especially with the unique view count). Throughout this research, ISD analysts noted that some ads returned by these keyword searches had been removed by TikTok due to a “violation of TikTok’s terms” (although it is unclear which term triggered the removal), which suggests that TikTok is attempting to moderate its ads to some extent.
However, as the image below shows, the enforcement of these policies often came too late, allowing ads featuring pro-Kremlin false claims to garner hundreds of thousands of views before being removed. The approach taken to removing and limiting ads was also found to be inconsistent. For example, ads that contained clips from Russian state media outlets such as Sputnik and RT have been accessible on the platform since January (despite an EU-wide ban), while other problematic ads were removed as recently as early July.
ISD also identified some gaps in the TikTok Ad Library’s functionality. Crucially, there is no way to tell how much an advertiser spent on an ad, and therefore how much TikTok is profiting from ads that may violate its community guidelines, such as those mentioned above. It is also unclear whether advertisers are refunded for ads that are removed for violating TikTok’s terms, or whether the platform retains these profits. ISD would welcome greater transparency on the monetisation of content across all social media platforms.
TikTok’s Ad Library currently provides data only on ads shown in Europe and the UK, which limits research to a small number of countries and languages. ISD recommends that TikTok expand the library, particularly to countries with significant numbers of TikTok users, such as the US, Indonesia, Brazil, and Mexico. TikTok should also expand moderation resources so that content in languages other than English is moderated just as consistently and comprehensively.
TikTok ads can help creators reach broader audiences and grow their businesses or profiles, but promoted content needs to be comprehensively moderated to avoid boosting disinformation narratives, conspiratorial claims, and content that fuels animosity towards specific groups.