16 August 2023
By: Julia Smirnova and Francesca Visser
Despite Meta labelling the Wagner Group as a “dangerous organisation”, the company has failed to remove posts providing recruitment information and celebrating its brutal activities in Ukraine and beyond.
Content warning: graphic descriptions of violence
In late June, a short-lived mutiny by the Russian paramilitary Wagner Group galvanised global attention to the organisation and its leader Yevgeny Prigozhin. The mercenary group, which has been active since 2014, is notorious for its brutality and human rights violations around the world. These crimes include the murder, rape and torture of civilians in the Central African Republic, Mali, Syria and Ukraine.
In recent years, the Wagner group has showcased sadistic violence in an effort to present itself as more ruthless than traditional military units. This public image appears to have attracted the attention of people particularly interested in extremely violent online content.
Previous research conducted by Logically in May of this year revealed that the Wagner Group was actively recruiting individuals using social media platforms, including Facebook and Twitter. Remarkably, this recruitment was happening despite Wagner being designated as a “dangerous organisation” by Meta and, as a result, being prohibited from having a presence on the company’s platforms.
According to Meta’s policies (see Appendix for more details), content that praises, shows substantive support (including fundraising and recruiting) for, or represents a “Tier 1 dangerous organisation” should be removed from its platforms. These are organisations which, by Meta’s own definition “engage in serious offline harm – including organising or advocating for violence against civilians, repeatedly dehumanising or advocating for harm against people based on protected characteristics or engaging in systematic criminal operations”. The same applies to “leaders, founders and prominent members” of dangerous organisations.
Meta has been criticised for its lack of transparency on “dangerous organisations”. The full list has never been made public by the company, despite calls from the Oversight Board. However, by ISD’s assessment, the Wagner Group fulfils the criteria of a “Tier 1” entity – particularly on account of its involvement in the murder and torture of civilians. There is also evidence that several leaders and founding members of Wagner belong to neo-Nazi groups and organisations or hold ultra-nationalist views. On that basis, content supporting and glorifying the group should not be present on Meta platforms, let alone recommended to users.
While the future of Wagner after the failed mutiny remains unclear, the mercenary group was not disbanded, and its leader Yevgeny Prigozhin has continued to seek public attention. Most recently, a channel associated with Wagner in the Central African Republic (CAR) published an audio message from Prigozhin about the coup in Niger. In the message he called the coup part of the struggle against Western ‘colonisers’ and praised the Wagner Group as a force that can bring ‘order’ to African countries and fight ‘terrorists’.
On the same day, Prigozhin reportedly participated in the Russia-Africa summit in St Petersburg, meeting representatives of African nations. Remarkably, a picture of Prigozhin in St Petersburg was initially published on Facebook by an account allegedly belonging to Dimitri Sytii, one of the top leaders of the Wagner Group in CAR. Sytii was recently sanctioned by the UK and was sanctioned by the US in 2020; as an individual associated with a “dangerous organisation”, he should not be allowed to have a Facebook account under Meta’s own policies.
In light of the recent failed mutiny and the ensuing uncertainty surrounding Wagner’s future, ISD set out to determine whether content praising the group or recruiting for it still exists on Meta platforms, which include Facebook, Instagram, and Threads.
This study identified a total of 114 accounts on Facebook and Instagram that were either impersonating or glorifying Wagner or posting recruitment content for the group. This is despite Meta’s designation of the group as a ‘dangerous organisation’, which in theory means it cannot have a presence on its platforms.
Within this total set of 114 accounts, further analysis by ISD found that:
- These accounts post in at least 13 languages: English, French, German, Italian, Spanish, Portuguese, Arabic, Macedonian, Polish, Romanian, Indonesian, Vietnamese, and Russian.
- Of the total 57 accounts on Facebook (including 26 pages, 25 profiles and six groups) posing as and/or glorifying the Wagner group and its activities:
- 15 had more than 10,000 followers or members at the time of writing. These pages and groups were focused not just on Wagner, but on pro-Kremlin content in general, with posts in French, Arabic, and Macedonian.
- 23 of them had posted recruitment material and/or information about how to join the Wagner group.
- A total of 57 accounts on Instagram were found to be impersonating and/or glorifying the group. Combined, these accounts had a followership of 10,175 users. Three of them spread information on how to join the group or posted recruitment material.
Meta’s policies clearly state that violent and graphic content should only be allowed on its platforms in cases where the imagery is used to ‘condemn and raise awareness’. However, much of the content posted to Facebook and Instagram that glorified Wagner featured videos and photos of violent acts committed by soldiers fighting with the group. At times this included depictions of violent murder and/or human corpses.
Additionally, during ISD’s investigations into Wagner content on Facebook (which used a research account that did not actively engage with any posts), the platform’s recommendation systems suggested our account join a pro-Wagner group. This was delivered both as a direct notification and as a recommendation under the ‘suggested groups’ section. This means that not only is Meta failing to detect content supporting the Wagner group on its platforms, but its algorithms may actually be automatically amplifying this content to users.
This research employed a combination of qualitative and quantitative techniques. ISD used Method52, a social media analysis tool developed by CASM and the University of Sussex, to analyse data from Facebook and Instagram that was collected through the CrowdTangle API. An initial screening exercise aimed to identify groups, pages, and accounts that both contained the word “Wagner” (spelled in either the Latin or Cyrillic alphabet) in their names and had also posted content using Wagner-related keywords. In addition, analysts used phone numbers, a website, and handles of Telegram bots found in Wagner recruitment posters and affiliated channels to locate content providing information on how to join the group.
All data subsequently underwent manual assessment and analysis, and supplementary manual searches were conducted to pinpoint content that glorified Wagner.
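The two-step screening described above – matching “Wagner” in account names in either alphabet, then checking posts against a keyword list – can be sketched as follows. This is an illustrative simplification, not ISD’s actual pipeline: the data structures are stand-ins for what would in practice be returned by an API such as CrowdTangle, and the keyword list here is a hypothetical example.

```python
# Illustrative sketch of the two-step screening exercise.
# Account names and posts are stand-in data; real data would come
# from an API such as CrowdTangle (these structures are assumptions).

NAME_VARIANTS = ("wagner", "вагнер")            # Latin and Cyrillic spellings
CONTENT_KEYWORDS = {"wagner", "вагнер", "чвк"}  # hypothetical keyword list

def name_matches(account_name: str) -> bool:
    """Step 1: does the group/page/account name contain 'Wagner'?"""
    lowered = account_name.lower()
    return any(variant in lowered for variant in NAME_VARIANTS)

def posts_match(posts: list[str]) -> bool:
    """Step 2: has the account posted content using Wagner-related keywords?"""
    return any(
        keyword in post.lower()
        for post in posts
        for keyword in CONTENT_KEYWORDS
    )

def screen(accounts: dict[str, list[str]]) -> list[str]:
    """Return the names of accounts that satisfy both screening conditions."""
    return [
        name for name, posts in accounts.items()
        if name_matches(name) and posts_match(posts)
    ]

sample = {
    "Wagner Fans": ["Вагнер heroes"],     # matches both conditions
    "News Daily": ["wagner update"],      # name does not match
    "Группа Вагнер": ["daily weather"],   # posts do not match
}
print(screen(sample))  # → ['Wagner Fans']
```

Candidates surfaced by a filter like this would still require the manual assessment described above, since name and keyword matches alone cannot distinguish glorification from reporting or condemnation.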
Wagner Group Recruitment Content
ISD identified 23 Facebook pages, profiles and groups that have published recruitment posters and videos from the Wagner Group or contact information, including phone numbers, Telegram channels specifically used for recruitment and its website. These accounts span six languages, including French, Arabic, Russian, Spanish, Romanian and Macedonian.
Several accounts posted a short video calling for men to join “the strongest private army in the world”; it features Prigozhin himself, claiming that the “Third World War” is near.
However, the group appears to be conscious of its external reputation. One of these 23 accounts shared an 11-minute video in which a Wagner fighter attempted to persuade viewers that joining the group wouldn’t lead to being mere cannon fodder, but rather, individuals would receive military education and be welcomed into a Wagner “family.”
Recruitment content found on the platform was mostly published before the mutiny and contained seemingly authentic contact details, such as phone numbers, Telegram bots and channels or links to the Wagner official website. However, ISD could not find any credible evidence linking these Facebook pages, groups, and profiles directly to the group. Though these posts appear to have limited engagement – typically between zero and just over one hundred likes – ISD also found individuals with significant audiences posting pro-Wagner content.
An example of this is Luc Michel, a Belgian far-right and pro-Kremlin activist, who was publicly exposed earlier this year as being behind a pro-Kremlin and pro-Wagner campaign targeting African countries. Despite being exposed, Michel continued to publish content praising and recruiting for Wagner to an audience of more than 71,000 Facebook followers.
Two other popular French-language pages that published recruitment information – one with over 143,000 followers and another one with over 34,000 followers – were also targeting two countries in Africa: Mali and Côte d’Ivoire.
In addition to this, ISD identified a profile that posted a link to a pro-Wagner Telegram bot (an automated programme that can send and receive messages and show users different commands) in two groups targeting Ukrainians. The identical posts called for “sharing information with our friends” via the Telegram bot. The bot claims to belong to “Wagner South” and offers two options: to join Wagner or to share information about locations of “Ukrainian Nazis” (a pro-Kremlin codeword for Ukrainian forces) and “accomplices of Nazis” (a codeword for any pro-Ukrainian person) on the territories occupied by Russia.
While all the identified posts with information on how to join Wagner were published on Facebook prior to the mutiny, ISD noticed that individual users became particularly interested in the group after the news about the insurrection broke out. ISD identified numerous posts and comments in which users appeared to be sincerely trying to get information about joining the mercenary group.
On Instagram, our investigation uncovered three accounts posting content containing information and contacts related to joining the Wagner group. One of these accounts featured a recruitment advert showcasing a Wagner soldier in Bakhmut. In the video, the soldier reveals that he had no prior military experience and had not served in the army before joining the group. He passionately describes the group as a “big family” and assures prospective recruits that they will receive thorough training and a warm welcome upon joining.
The video concludes with a list of phone numbers and a website where interested individuals can sign up to join the group. The other two accounts, which post in English, also offer assistance in joining the group, providing different phone numbers and separate contact accounts for those interested in enlisting.
Glorification of Wagner Group Activities
On Facebook, ISD identified 54 pages, groups and individual profiles that either posed as Wagner or glorified the group and its mercenaries. The content posted by these pages, groups and profiles was produced in at least 13 languages: English, French, German, Italian, Spanish, Portuguese, Arabic, Macedonian, Polish, Romanian, Indonesian, Vietnamese, and Russian.
As with accounts posting recruitment content, ISD could not find any credible evidence that these pages, groups and profiles were affiliated with Wagner; however, they did have sizeable audiences: 15 of the pages and groups had more than 10,000 followers or members. Those pages did not focus exclusively on pro-Wagner content but were typically pro-Kremlin. The most popular page in the data set was one supporting Koudou Laurent Gbagbo, former President of Côte d’Ivoire, with an audience of more than 313,000 followers. This page posted a video of a staged scene in which a Wagner fighter supposedly ‘escapes kidnapping by terrorists’ and praised the soldier’s bravery – it received almost 40,000 reactions and over 7,500 shares.
The identified pages, profiles and groups published numerous posts glorifying the Wagner Group both prior to the mutiny and after it – this means there has been a consistent presence of pro-Wagner content on Facebook for some time.
Posts from these accounts included videos, many of which were previously published by Russian state media or Russian-language pro-Wagner or pro-Kremlin Telegram channels and contained their watermarks. The videos typically show Wagner mercenaries fighting in Ukraine, with battle scenes sometimes accompanied by pro-Wagner songs. One of the videos shared by several pages features a Wagner fighter raising the Russian flag in the ruins of the Ukrainian town of Bakhmut and shouting obscenities about Ukrainians. The months-long battle of Bakhmut earlier this year has come to be known as one of Europe’s bloodiest infantry battles since the Second World War. Russian forces stand accused of war crimes in the town, including the use of white phosphorus in civilian areas.
Another video shared by several pages was a song praising Wagner in Arabic with an Iraqi accent and with Arabic and Russian subtitles. The song was initially published by Sabreen News, an outlet linked to pro-Iranian militias in Iraq that regularly echoes Kremlin propaganda lines. The song glorifies Wagner fighters as heroes and calls them “Shrougi” – thus framing them as belonging to Shia from Southern Iraq. The song was accompanied by war footage and pictures showing the Wagner group’s founder Yevgeny Prigozhin together with Vladimir Putin.
Other content glorifying Wagner included numerous pictures of mercenaries posing with weapons, often accompanied by heart emojis, emojis of musical instruments (an allusion to the Wagner Group being unofficially called “orchestra” or “musicians”) and comments calling Wagner “heroes” or “most feared Russian warriors”. Several pictures included large and clear watermarks of pro-Wagner Telegram channels, thus making it easy for Facebook users to find them by their handles.
Several pages and groups posted content openly glorifying violence and killings committed by the Wagner Group, as well as by other units of the Russian army. In one instance, a graphic video of a killing of a Ukrainian soldier, allegedly at the hands of Wagner Group soldiers, was published by one of the pages. Another video shows a Wagner fighter posing with dozens of dead bodies, allegedly of Ukrainians killed by the mercenaries in Bakhmut. Yet another video showed alleged Wagner fighters “sentencing” a Ukrainian to torture and death. Other content included videos and pictures celebrating the destruction of Ukrainian cities and towns.
Facebook recommends a pro-Wagner group
After ISD reviewed several pro-Wagner pages and groups, Facebook’s recommendation systems suggested an additional pro-Wagner group to the research profile, both via notifications and the “Discover” section, where it lists groups that the user might be interested in. The profile did not engage with any pro-Wagner content during the course of this study, instead simply passively watching videos and searching for Wagner-related keywords. As of 25 July, the recommended group has 1,320 members who publish content praising Wagner and ask how to join the mercenary group.
One of the users in the recommended Facebook group posted a link to a closed WhatsApp chat, where users share Wagner’s contact details, pro-Wagner symbols and in one case a swastika.
ISD identified three Facebook pages with a combined followership of more than 136,000 that regularly post content glorifying Wagner in Arabic. The pages are linked to a website, a Twitter account and a Telegram channel, all of which continually report on the war in Ukraine as told by Russian sources.
These three Facebook pages began showing support for the Wagner mercenary group in December 2022. A post showing a Wagner concert, featuring a hammer-wielding orchestra member banging on a drum, stated: “not many will understand, but many will remember this post later.” The post was meant as a wink and a nod toward a gruesome video of an execution carried out by Wagner mercenaries in Ukraine. The page then claimed in January 2023 that Wagner was supportive of its efforts.
Altogether, the pages have produced at least 185 posts — primarily videos — venerating or supporting the Wagner mercenary group since the start of the Russian invasion of Ukraine. The most viewed post, a video with over 712,000 views, shows Wagner mercenary forces testing tank traps set up at a defensive line in Ukraine. These pages were also found to be luring Facebook users to Telegram with links to graphic content showing dead bodies.
Pages and groups that have published content glorifying Wagner usually publish other content supporting the Russian invasion of Ukraine. These users often repeat Russian propaganda lines about “denazification” of Ukraine as an alleged aim of the war.
Content glorifying Wagner was not, however, limited to the group’s presence in Ukraine but also included pictures of Wagner mercenaries in Africa and the Middle East. After the mutiny, one of the French Facebook pages included in the study published an ad for Wagner offering its “services” in Africa and claiming to protect people there from “militants and terrorists”.
Among the Facebook pages sharing content that praised Wagner, ISD also noticed a cluster of 13 pages posting in Macedonian. These pages have a collective followership of more than 185,000 and frequently repost each other’s content. The posts from this small network included pro-Kremlin and pro-Wagner content similar to that seen in other languages, and were being reposted into Macedonian Facebook groups.
On Instagram, our searches identified 57 accounts posing as Wagner or glorifying the group, with only four of these having been restricted by the platform in recent weeks. Of these, 48 accounts include the word “Wagner” in their account name, and 30 feature the Wagner logo as their profile picture. Most of these accounts have a relatively modest following, with only four of them surpassing 1,000 followers. However, when combining the followership of all 57 accounts, they reach a total of 10,175 followers.
While Instagram does not provide information on account creation dates, ISD recorded the date of the first post published by each account as an indicator of its activity start date. This analysis was possible for the 29 accounts that were neither private nor restricted and had posted at least one piece of content. 24 of these 29 accounts were created after the full-scale invasion of Ukraine in February 2022, and 14 accounts were established between June and July 2023, immediately following the Wagner mutiny. This illustrates how growing interest in the Wagner group has resulted in a surge of new accounts glorifying its activities on Instagram.
The content posted by the analysed accounts includes glorification of the group and graphic combat footage featuring Wagner soldiers in Ukraine, Central African Republic, and Syria. The videos and images posted by these accounts also involve heavy symbolism and glorification of violence. Particularly noteworthy are several videos and images featuring the logo of an angel with a sledgehammer destroying the trident found on the coat of arms of Ukraine.
This sledgehammer symbol holds historical significance for the Wagner group, dating back to the Syrian civil war, when a member of the Wagner group brutally murdered a Syrian Army deserter using this kind of tool. Subsequently, the sledgehammer became a symbol for the group and was used in merchandise and social media content by both Wagner members and their supporters.
During the war in Ukraine, the sledgehammer once again became an instrument of punishment and was used in two murders of former Wagner mercenaries who had allegedly changed sides or surrendered to Ukraine. Again, these murders were recorded and the videos were subsequently shared across social media channels.
Much like on Facebook, content glorifying the group on Instagram frequently elicited responses from other users expressing their strong interest in joining. Many of these users sought additional information on how to become part of the group.
Despite previous reports drawing attention to the existence of pro-Wagner recruiting content on Meta platforms in violation of the platform’s Community Standards, support for the mercenary group persists on Facebook and Instagram. ISD has identified content that glorifies the Wagner group, celebrates its brutal activities in multiple countries, and amplifies its previous recruitment endeavours. This content clearly contradicts Meta’s own policy concerning dangerous organisations and possibly the company’s policies on violent and graphic content as well. Yet, it has been disseminated to an audience of several hundred thousand users. Facebook’s own recommendation systems even suggested a pro-Wagner group to the research profile used for this study.
Social media platforms must take decisive action to enforce their own policies consistently. Particularly during and after events that bring a certain group into the spotlight, platforms should actively enhance their monitoring efforts to prevent recruitment activities.
Research often shows that moderation practices on social media platforms are lacking when it comes to content in languages other than English. ISD has previously found that terrorist content targeting East Africa was present on Facebook in Somali, Kiswahili and Arabic and that these languages were not moderated effectively; that Arabic-language Facebook groups were used to buy and sell weapons; and that misinformation about Covid-19 was widespread across Arabic-language Facebook. Platforms should invest more effort in moderation in languages other than English, as our findings showed that pro-Wagner content was spread by French, Arabic and Macedonian Facebook pages. Some of these pages have audiences of over 10,000 users, suggesting that moderation in these languages is less effective.
Regular updates to the keyword lists associated with designated dangerous organisations are crucial to stay current with slang, codewords, and unique contact information they might employ. Using these keywords, ISD was able to identify Wagner recruiting content even with limited data access. Given Meta’s own technical capabilities, this approach should improve the quality of moderation.
Meta’s policies on “dangerous organisations” consulted for this research
Meta’s policies against “dangerous organisations” state: “We do not allow organisations or individuals that proclaim a violent mission or are engaged in violence to have a presence on Facebook. We assess these entities based on their behaviour both online and offline – most significantly, their ties to violence.”
Meta divides these entities into three tiers:
Tier 1
Definition: “Tier 1 focuses on entities that engage in serious offline harm – including organising or advocating for violence against civilians, repeatedly dehumanising or advocating for harm against people based on protected characteristics, or engaging in systematic criminal operations.”
Enforcement: “We remove praise, substantive support and representation of Tier 1 entities, as well as their leaders, founders or prominent members.”
Tier 2
Definition: “Tier 2 focuses on entities that engage in violence against state or military actors, but do not generally target civilians – what we call “violent non-state actors”.”
Enforcement: “We remove all substantive support and representation of these entities, their leaders and their prominent members. We remove any praise of these groups’ violent activities.”
Tier 3
Definition: “Tier 3 focuses on entities that may repeatedly engage in violations of our Hate Speech or Dangerous Organisations Policies on or off the platform, or demonstrate strong intent to engage in offline violence in the near future, but have not necessarily engaged in violence to date or advocated for violence against others based on their protected characteristics.”
Enforcement: “Tier 3 entities may not have a presence, or coordinate on our platforms.”
Moreover, Meta pledges to remove ambiguous content: “We recognise that users may share content that includes references to designated dangerous organisations and individuals to report on, condemn or neutrally discuss them or their activities. Our policies are designed to allow room for these types of discussions while simultaneously limiting risks of potential offline harm. We thus require people to clearly indicate their intent when creating or sharing such content. If a user’s intention is ambiguous or unclear, we default to removing content.”
Meta defines praise, substantive support and representation as follows:
The designation of Wagner as a dangerous organisation was confirmed by a Meta spokesperson to Politico at the end of May. However, the spokesperson did not specify which tier is assigned to Wagner but did state that Meta removes “praise or substantive support for Wagner”, which would indicate Tier 1.