Commercial Disinformation
By Elise Thomas and Kevin D. Reyes
Commercial disinformation is a broad term covering a wide variety of actors and activities that promote false, misleading or manipulated content presented as fact, with the intent to deceive or harm, and motivated primarily by commercial gain rather than political or ideological conviction. Regardless of the motivation, this activity damages the health of the information environment.
Many business models exist for turning disinformation into profit. This Explainer looks at two broad business models used for commercial disinformation:
- Disinformation as a service (DAAS)
- Disinformation as a product (DAAP)
The general difference between these two industries can be thought of as analogous to the difference between a marketing firm and a merchandise retailer.
Disinformation as a service
The actors behind disinformation as a service (DAAS) are often marketing, communications and public relations firms. These firms appear to offer DAAS indiscriminately to a wide range of paying customers, including governments, politicians and businesses. Unlike, for example, the Internet Research Agency, which has been found to deploy disinformation campaigns on behalf of the Russian government, these firms are not usually linked to a particular state.
In this business model, commercial disinformation operators are paid by their client to run social media campaigns that include seeding and/or amplifying false and misleading content. These campaigns tend to focus on elections, proposed laws, and other political issues, and the firms can be hired either to promote or to attack people, entities, and narratives. In addition to generating disinformation, their offered services can include coordinated and/or inauthentic behaviour (e.g. impersonation, harassment, and hashtag hijacking) as well as legitimate marketing, political communications, and social media management techniques.
Some firms rely on large networks of social media bots (i.e. fake accounts), others leverage a series of websites, and some combine both approaches to extend the reach of their messaging. As with many legitimate agencies, success is measured against key performance metrics established by the client, such as the number of posts, number of websites created, total followers or subscribers, or the interactions received from other accounts. This creates a different set of incentives for DAAS actors compared to either non-commercial actors or DAAP actors: their incentive is to fulfil their contract, which may or may not align with generating a meaningful impact.
For example, scholars at the Stanford Internet Observatory have argued that “marketing firms might be able to meet certain quotas without having the impact that government clients think they’ve purchased.” They may, for example, aim for quantity over quality by focusing on the volume of their output (e.g. number of posts) instead of organic engagement that may actually affect political discourse. Clients may, of course, eventually catch on to this vulnerability and demand stronger key performance metrics such as actual changes in political discourse that can be measured by polling data.
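To make this incentive gap concrete, the minimal sketch below contrasts a volume-based quota with a simple interactions-per-post ratio. All figures are hypothetical and purely illustrative; real campaign data of this kind is rarely public.

```python
# Illustrative sketch of the gap between a volume-based quota and
# actual engagement. All figures are hypothetical.

posts_delivered = 5_000      # volume KPI reported to the client
total_interactions = 2_500   # likes, shares and replies from other accounts

# A client checking only the contracted quota sees the target met.
quota = 5_000
print(f"Quota met: {posts_delivered >= quota}")

# An interactions-per-post ratio tells a different story: each post
# attracted half an interaction on average, suggesting little organic reach.
print(f"Interactions per post: {total_interactions / posts_delivered:.2f}")
```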
As this is a “shadow industry”, the exact number of firms and operations is not known. Researchers at Oxford University, however, identified 48 countries in which dozens of private firms operated “computational propaganda on behalf of a political actor” in 2020, a marked increase from just nine countries in 2017. Further scrutiny by journalists, researchers and online platforms continues to shed light on the range of actors and their operational tactics. Three of the most notable recent examples are presented below.
In February 2023, several media organisations, including The Guardian, reported on their investigation into ‘Team Jorge’, a group of Israeli contractors who offer to engage in covert and sometimes illegal political operations for a fee.
According to reporters, mass disinformation campaigns were a key offering from ‘Team Jorge’, alongside other unethical and/or illegal practices including account hacking and blackmail. The firm claimed to have a software program capable of controlling more than 30,000 fake accounts. Each fake persona had detailed personal attributes (e.g. date of birth, location, political beliefs, and photos) and was “multi-layered”, meaning it held multiple accounts across social media, email, and even e-commerce websites to give the impression of belonging to a real person. The alleged leader of ‘Team Jorge’, Tal Hanan, told undercover journalists that these accounts could be tailored to the specific regions or demographics the client wished to impersonate, such as young political left-wingers in the US or middle-aged conservatives in Latin America.
Hanan also claimed that ‘Team Jorge’ had worked on “33 presidential-level campaigns” globally. It is important to note, however, that the claims Hanan and his colleagues made to undercover reporters about the extent and effectiveness of Team Jorge’s services were part of a pitch to secure a contract, and could not be independently verified.
Predictvia, a company registered in Florida and operating in the US and Venezuela, was reported by Meta in May 2023 to have operated dozens of fake accounts on Facebook, Instagram and Twitter, along with fake websites intended to influence local and national politics in Guatemala and Honduras. Predictvia also had thousands of fake Twitter accounts at its disposal, according to reporting by Reuters.
The company’s website describes Predictvia as being “in the front line of the fight against misinformation”, and claims to use information generated by artificial intelligence to “represent the appearance of a human being”. Predictvia’s LinkedIn company profile states that the company uses “data services that boost your marketing efforts and market insights”. The investigation into Predictvia is considered a “rare exposé” — though not the first — of a DAAS firm based in the US.
A 2019 investigation published by Bellingcat found that a covert campaign aimed at distorting the truth and whitewashing the Indonesian government’s actions in West Papua was linked to InsightID, a Jakarta-based marketing company. While the audiences in this case were users in the UK, Europe and US, the same company was running other ad campaigns, including for clients in the K-pop music industry. This highlights that some firms walk both sides of the line, running legitimate marketing operations alongside covert influence campaigns. The company was using various methods to track its metrics, including Google Analytics tracking codes, and had also spent thousands of dollars on Facebook advertising for its fake pages and websites.
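Shared analytics identifiers are one common way researchers cluster ostensibly unrelated websites, since sites run by a single operator often reuse the same Google Analytics account. The sketch below illustrates the idea under stated assumptions: the domains, page snippets and IDs are hypothetical, and the classic ‘UA-’ format shown has since been superseded by GA4’s ‘G-’ measurement IDs.

```python
# Minimal sketch: group websites by shared Google Analytics account IDs.
# Domains and IDs are hypothetical placeholders.
import re
from collections import defaultdict

GA_ID = re.compile(r"UA-\d{4,10}-\d{1,4}")  # classic Universal Analytics format

# In practice the HTML would be fetched over the network; hard-coded
# snippets keep the example self-contained.
pages = {
    "news-site-a.example": "<script>ga('create', 'UA-1234567-1');</script>",
    "news-site-b.example": "<script>ga('create', 'UA-1234567-2');</script>",
    "unrelated.example": "<script>ga('create', 'UA-7654321-1');</script>",
}

by_account = defaultdict(list)
for domain, html in pages.items():
    for match in GA_ID.findall(html):
        account = match.rsplit("-", 1)[0]  # keep the account part, e.g. UA-1234567
        by_account[account].append(domain)

# Domains sharing an analytics account are candidates for common ownership.
for account, domains in by_account.items():
    if len(domains) > 1:
        print(f"{account} shared by: {', '.join(domains)}")
```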
The DAAS industry can be seen as a spin-off from the now well-known practice of state-run information operations. Instead of operating a disinformation campaign and incurring the associated risks directly, states can simply outsource this work to DAAS firms, which are also willing to work with politicians running for office and other non-state actors. While the client’s motive may be political, the firm’s motive is commercial. It should be noted, of course, that the marketing, communications, and public relations industries also regularly deploy online campaigns as part of their legitimate work.
Regardless of how effective their tactics might be, it is clear that there is a global demand for the kind of services disinformation-for-hire firms have to offer.
Disinformation as a product
The second prominent type of commercial disinformation involves actors generating disinformation to take advantage of controversial, high-engagement topics and generate revenue from ads and merchandise sales. These actors include highly organised groups, loosely affiliated networks and individuals working alone, and the scale of their operations can range from a single social media account to dozens or even hundreds of accounts and websites. These actors embrace the supposed “cheap speech” of the online ‘marketplace of ideas’ by betting on whichever discourse attracts high engagement.
The operators of these accounts and websites work to secure high engagement from their audience, as this directly determines their ad and sales revenues. They will produce content, and sell related merchandise, around whatever topic brings high profits, whether that is promoting QAnon conspiracy theories or simply posting cute animal videos. Disinformation is not a goal in and of itself; the actors’ goal is simply to create engaging content, and whether it is true or not is irrelevant. Accordingly, when they find a niche of highly engaged users, such as QAnon conspiracy theorists or dedicated Trump fans, they will capitalise on it.
Some of these operations have been observed to “pivot” from more “wholesome” content (e.g. images of cute animals or clips from TV shows) to conspiracy theories and disinformation, or to post the two types of content side by side. Accounts on social media are created or purchased and cultivated to attract large, organic audiences. Once they have acquired a significant number of followers, these accounts may even be sold to others, including DAAS firms, looking to operate established accounts without having to grow them organically.
Many of these social media profiles and pages post content that will generate high engagement, such as content relating to anti-vaccine conspiracy theories or elections in the US. Posts eventually direct audiences to external content, such as YouTube videos or websites run by the same operators. At this point, the operators can make money off their audience by earning ad revenue on YouTube or on their websites.
YouTube videos and content posted on these websites are often plagiarised from other sources. In a study examining the effectiveness of European Union sanctions imposed on RT in response to the 2022 invasion of Ukraine, ISD researchers found dozens of websites plagiarising whole articles from RT that were rife with disinformation about Ukraine. All the websites hosted ads, and one attracted approximately 113,000 visitors in a single month. In this case, non-state actors helped amplify Russian-backed disinformation by reposting it for ad revenue on their own news aggregator websites.
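As a rough illustration of the economics, the back-of-the-envelope sketch below applies assumed pages-per-visit and RPM (revenue per 1,000 pageviews) figures to the visitor count reported above. Only the 113,000 figure comes from the study; actual rates vary widely by ad network, niche and audience geography.

```python
# Back-of-the-envelope estimate of ad revenue for a site of this size.
# The pages-per-visit and RPM figures are assumptions, not reported data.

monthly_visitors = 113_000      # visitor figure reported in ISD's study
pages_per_visit = 1.5           # assumed
rpm_low, rpm_high = 1.0, 5.0    # assumed USD per 1,000 pageviews

pageviews = monthly_visitors * pages_per_visit
low = pageviews / 1_000 * rpm_low
high = pageviews / 1_000 * rpm_high
print(f"Estimated monthly ad revenue: ${low:,.0f} to ${high:,.0f}")
```

Even a few hundred dollars a month may be worthwhile when the content is plagiarised at near-zero cost and, as in the case above, a single operator can run dozens of such sites.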
Other posts from these actors also direct their audience to purchase merchandise related to the conspiracy theories and disinformation that they post. The actors are either directly selling the product to the follower-turned-customer or are engaged in affiliate marketing. Affiliate marketing, according to e-commerce company Shopify, “involves referring a product or service by sharing it on a blog, social media platform, podcast, or website”, while “the affiliate earns a commission each time someone makes a purchase”.
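The arithmetic of this model can be illustrated with a short, hypothetical example; none of the figures below (price, commission rate, traffic, or conversion rate) come from the reporting above.

```python
# Illustrative affiliate-marketing arithmetic. Every figure is hypothetical.

product_price = 29.95     # e.g. a commemorative coin
commission_rate = 0.30    # share of each sale paid to the affiliate (assumed)
clicks = 10_000           # audience members who follow the affiliate link
conversion_rate = 0.02    # share of clicks that result in a purchase

sales = clicks * conversion_rate
earnings = sales * product_price * commission_rate
print(f"{sales:.0f} sales -> ${earnings:,.2f} in commissions")
```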
ISD’s research, Conspiracy Clickbait, examined the business models of three operations based in Vietnam which used QAnon and other US conspiracy theory content to make money via YouTube ads, merchandise sales, and affiliate marketing. The operators registered domains and developed e-commerce websites which sold shirts and hoodies with pro-Trump, anti-vaccine, and QAnon messages. These listings were then shared to their audiences on YouTube and Telegram.
Other websites in our Conspiracy Clickbait series had affiliate links for products such as “Trump Revenge Coins.” In the days before and after the January 6, 2021, attack on the US Capitol, Trump coins were “soaring” in popularity on ClickBank, an Idaho-based e-commerce platform popular in affiliate marketing, according to the New York Times.
Similar reporting by NBC in May 2023 showed the real-world effects of these pro-Trump coins, bucks, checks, and cards. Many social media posts pushed financial conspiracy theories to sell these products using affiliate marketing links — sometimes with the help of AI-generated videos. As a result, unsuspecting Trump supporters have spent thousands of dollars on these products and some have even attempted, unsuccessfully, to convert them to US dollars at banks.
Other websites analysed in our Conspiracy Clickbait series also directly solicited donations from their audiences.
Online clickbait and merchandise operations have evolved into a significant industry in many parts of the world. Although there are certainly some operators in wealthy countries, the economics of this industry mean these operations are largely based in developing countries and targeted at consumers in wealthy ones. The industry often provides a way for educated, tech-savvy, and usually young people with limited job prospects to make money online and, depending on where they live, potentially earn a very good income relative to their communities.
Parts of Africa, Eastern Europe, South Asia, Southeast Asia, and Southeast Europe are emerging as particular geographic centres of disinformation as a product. A striking example of the industry driving local economic growth is the infamous ‘fake news’ content farms of Veles, North Macedonia, which monetise largely via ad revenue and have provided locals with a relatively high income.
The motive for the DAAP industry is entirely commercial, but the effect can be political. Unlike the DAAS industry, which does not necessarily rely on engagement for its revenue, operators developing disinformation as a product are primarily concerned with engagement metrics, which drive their ad revenue on YouTube and their websites, their merchandise sales through e-commerce sites, and their commissions through affiliate marketing links.
Artificial intelligence (AI) considerations
As sophisticated chatbots like ChatGPT become more commonplace, and as AI makes it dramatically cheaper, quicker and easier to create unique and appealing websites, production costs for commercial disinformation are likely to drop. AI will also make it easier for operators to target consumers across linguistic and cultural divides. Operators, for example, may no longer need to speak or write passable American English to target American consumers, as chatbots can generate convincing, linguistically and culturally appropriate content on their behalf.
As we have seen in some of the examples above, AI has already been used by DAAS firms such as Predictvia, while DAAP schemes such as those selling pro-Trump coins have also used AI-generated videos to market their products.
Although the long-term impact of this technology remains to be seen, in the short term it seems likely that there will be a spike in the number of commercially motivated actors seeking to make money by targeting audiences with specific content, including disinformation and conspiracy theories. With very low costs and few barriers to entry, the incentive to flood the online landscape with more disinformation will be extremely high.
Sources for further reading
Disinformation as a service (DAAS)
Stephanie Kirchgaessner et al., “Revealed: The Hacking and Disinformation Team Meddling in Elections,” The Guardian, February 15, 2023.
Zeba Siddiqui and Christopher Bing, “Latin American Election Influence Operation Linked to Miami Marketing Firm,” Reuters, May 4, 2023.
Joe Uchill, “Disinformation as a Service Crosses Borders with Ease,” Axios, October 3, 2019.
Max Fisher, “Disinformation for Hire, a Shadow Industry, Is Quietly Booming,” New York Times, July 25, 2021.
Craig Silverman, Jane Lytvynenko, and William Kung, “Disinformation for Hire: How a New Breed of PR Firms Is Selling Lies Online,” BuzzFeed News, January 6, 2020.
Disinformation as a product (DAAP)
Elise Thomas, “Conspiracy Clickbait: This One Weird Trick Will Undermine Democracy,” ISD, January 2023.
Stuart A. Thompson, “How Trump Coins Became an Internet Sensation,” New York Times, January 28, 2022.
Brandy Zadrozny and Corky Siemaszko, “‘Trump Bucks’ Promise Wealth for MAGA Loyalty. Some Lose Thousands,” NBC News, May 27, 2023.
“The Fake News Machine: Inside a Town Gearing Up for 2020,” CNN, September 15, 2017.
Craig Silverman and Lawrence Alexander, “How Teens in the Balkans Are Duping Trump Supporters with Fake News,” BuzzFeed News, November 3, 2016.
_________________________________________________________________________________
This Explainer was uploaded on 14 June 2023.