"It Was Very Hard for Me to Keep Doing That Job": Understanding Troll Farm's Working in the Arab World


Abstract

This article investigates the production culture and routines of "troll farms" in three Arab countries—Tunisia, Egypt, and Iraq—from a production studies approach. A production studies approach enables us to focus on the working conditions of paid trolls. We employed qualitative methods to look inside the "black box" of Arab troll farms. From February to April 2020, we conducted semi-structured interviews with eight disinformation workers at both managerial and staff levels. We propose to understand disinformation work as a specific type of digital labor, characterized by very intense shifts, emotionally burdensome daily tasks, the absence of legal job contracts, and highly surveilled work environments. The article contributes to the understanding of disinformation practices outside and beyond the West; it situates disinformation activities within the broader context of digital media industries; it provides a detailed analysis of the features that distinguish troll farms in the Arab world from those that emerged in other regions of the Global South; and it reconnects research on disinformation to digital labor studies.

Introduction

A troll farm, or troll factory, is an institutionalized group of internet trolls (De Seta, 2017) that seeks to shape the political agenda of a specific society and to interfere in political opinions and decision-making processes. Troll farms became a globally salient issue during the 2016 U.S. general election and the Brexit referendum (Pomerantsev, 2019). The role of troll farms in "computational propaganda" campaigns (Woolley & Howard, 2018) has spawned a vast popular literature that often ends up representing trolls as "folk devils" (Cohen, 1972; De Seta, 2017) responsible for igniting moral panics in society.
To avoid falling into these stereotypical representations, it is necessary to learn more about the phenomenon of trolling and disinformation for hire. To do so, we believe it is useful, as Cabañes (2020) argues, to adopt "an approach that can complement production-focused studies about this pernicious phenomenon" (p. 2).
To understand the phenomenon of digital disinformation, then, in this article we will adopt the perspective of production studies (Mayer et al., 2009) and focus on the forms of disinformation production in three Arab countries of the Middle East and North Africa (MENA) region: Tunisia, Egypt, and Iraq. The focus on the Arab world is motivated by the fact that there are relatively few studies on the disinformation strategies of troll farms in Arab countries, apart from a few recent exceptions (Al-Rawi, 2021; Byman, 2021; Jones, 2019, 2021, 2022; Lamboley, 2019). Troll farms and disinformation for hire in the MENA region are not only under-investigated in academic research but also an uncommon topic for both Western and MENA legacy media.
The adoption of a multi-country approach in this article was not planned but rather inductively emerged from the fieldwork. Our first Tunisian informant put us in touch with his acquaintances who operated as paid trolls in other regions of North Africa and the Middle East, suggesting that the troll farm industries in the MENA region are not bound to national borders but are networked at different levels.
Yet, adopting a multi-country approach should not lead us to assume that political contexts and media industries are the same across Egypt, Tunisia, and Iraq. These countries do not share the same level of media freedom or the same media regulatory frameworks.
To understand the phenomenon of disinformation for hire in the MENA region, we draw from the work of Ong and Cabañes (2019), who investigated disinformation operations in Southeast Asia and were the first to propose a dialogue between media production studies and disinformation studies. The research by Ong and Cabañes offers a relevant framework for a study of the MENA region for two reasons: (1) both studies focus on countries where government leaders are the biggest bad actors in the disinformation industries; (2) in both cases, political and economic contexts lend themselves more easily to disinformation-for-hire dynamics than to ideologically driven disinformation networks (Ong, 2021).
Production studies have a long tradition of "decoding" the sites and cultures of media content production and can complement disinformation studies by shedding light on the complex socio-technical network of actors and infrastructures that lies at the heart of online disinformation production.
Taking this approach will also enable us to understand paid trolls not as "folk devils" (Cohen, 1972) animated by populist sentiments, or as a mercenary army in the pay of some authoritarian or corrupt state, but as precarious, reluctant, invisible, and underpaid digital workers in their domestic creative media industries. We will also show the relationships between troll farms and the media ecosystems to which they belong: far from being an external threat to local media systems, they are instead an invisible but constitutive part of them.
Given the difficulties we encountered in accessing the field, in the next section, we will provide a "thick" description of the data generation process on which this research is based, as these notes may be useful to future scholars. Then, we will move on to discuss the main findings of our research, presenting disinformation work as a precarious, invisible, emotionally intense, and underpaid kind of job, and disinformation companies as an integral part of domestic media ecosystems. In the conclusions we will foreground and discuss the four main contributions of this article: (1) it focuses on disinformation practices outside and beyond the West, with which the literature is overly preoccupied; (2) it situates disinformation activities within the broader context of digital media industries; (3) it provides a detailed analysis of the features that distinguish troll farms in the Arab world from those that emerged in other regions of the Global South; (4) it reconnects the research on disinformation for hire to digital labor studies.

Research Design: Troll Farms as "Black Boxed" Research Field

Our research aimed to investigate the production culture and production routines of troll farms in the Arab world.
We employed qualitative methods to look inside the "black box" of troll farms. From February to April 2020, we conducted semi-structured interviews with eight disinformation workers at both managerial and staff levels (5 Egyptians, 1 Tunisian, 1 Iraqi, and 1 Turkish; see Table 1). Most of our interviewees, apart from the first two, were involved in producing comments and amplifying messages through different fake profiles. Their profile corresponds to what Ong and Cabañes (2019) have called "community-level fake account operators" (p. 5778), tasked with "sharing and amplifying core campaign messages in the online communities and Facebook groups they had infiltrated" (p. 5778).

Table 1. Composition of the Sample of Respondents.
| Informant | Nationality | Age | Degree | Role |
| --- | --- | --- | --- | --- |
| #1 (Tarek) | Tunisian | 32 | Bachelor's degree in Journalism | Supervisor |
| #2 | Egyptian | 29 | Bachelor's degree in Languages | Supervisor, former fake account operator |
| #3 | Egyptian | 30 | Bachelor's degree in Journalism | Fake account operator |
| #4 | Egyptian | 35 | Degree in Pharmacy | Fake account operator |
| #5 | Egyptian | 21 | Journalism student | Fake account operator |
| #6 | Egyptian | Refused to disclose | Refused to disclose | Fake account operator |
| #7 | Iraqi | 30 | Master's degree in Communication Sciences | Fake account operator |
| #8 | Turkish | Refused to disclose | Refused to disclose | Supervisor, former fake account operator |

Each interview lasted between 60 and 90 minutes, and in some cases, such as with interviewee 1, we had multiple conversations during the fieldwork. None of the interviewees agreed to share their phone numbers with us, so we interviewed them through Telegram, except for interviewee 1, who was already a personal contact of ours. The interviews were conducted in Arabic, transcribed, and then coded following a Grounded Theory approach (Charmaz, 2006). We entered the fieldwork with the initial goal of understanding what exactly the work of an Arab troll consists of. The research questions emerged during the fieldwork. The iterative process of making sense of our interviews led us to identify three main topics: (1) the organization of work in troll farms based in Tunisia, Egypt, and Iraq; (2) the way paid trolls make sense of their job; and (3) the entanglement between troll farms and traditional domestic media.
Entering the field of Arab troll farms was by no means easy. As other scholars before us have already pointed out, finding respondents was a very "slow and tricky process" (Ong & Cabañes, 2019, p. 5777).
However, the difficulty of accessing this field is common to other areas of the media industry. As Seaver (2017) and Bonini and Gandini (2019, 2020) have shown, technology companies that operate online music streaming platforms are particularly reluctant to grant access to any kind of academic inquiry. Bonini and Gandini (2020) described the entire field of technology companies developing automated recommendation systems as a "black box," given how refractory these companies are to any academic research attempt they have not funded or approved. Drawing from the work of Bonini and Gandini (2020), we also define the field of troll farms as a black box: impermeable environments, difficult for researchers and journalists to access, and not subject to any public vetting process.
Yet, production studies scholars have the tools to explore these black boxes: as Bonini and Gandini (2020) remind us, "media studies have a strong record in 'unpacking' black boxes like newsrooms and sites of media production" (p. 1). If we consider troll farms as an emerging field of production studies, we can apply to this emerging black boxed field the same research toolkit we normally use in media ethnography (Murphy, 2011).
Therefore, taking inspiration from the work of scholars who have preceded us in this long process of unraveling the production cultures of the media industries (Gans, 2004; Mayer et al., 2009; Paterson et al., 2008), we tried to apply the same tactics they used to access our field. Particularly useful here were some of the tactics described by Bonini and Gandini (2020), such as relying on personal contacts, focusing on ex-workers, and treating "interviews as fieldwork" (Seaver, 2017).
After a series of unsuccessful contacts on Twitter and in Facebook groups, one of us tried to reach out to a classmate from her college days. Tarek (his name is fictional) had studied for a BA in Journalism and Communication with one of the authors at the University of Tunis, in 2014. In 2019, Tarek had proposed that one of us join his team, which was working on a "political communication" project (as he called it). The project concerned the Tunisian presidential elections and consisted of promoting the image and reputation of two political parties and their presidents, who were candidates in the elections.1 His proposal had been rejected, but one of us had kept in touch with him and contacted him again at the end of 2019. Tarek was our first interviewee and was the key to our access to the field. Starting fieldwork from one's personal contacts is very common in ethnographies of places of production. Gans (2004), in the introduction to his classic Deciding What's News, admitted that his first informants had been journalists introduced to him by "a friend from college days" (p. 76). Even Hannerz (2002), in his ethnography of foreign news correspondents, maintained that access to the field is more and more dependent on the entanglement between researchers and "the people in our fields" (p. 58).
Tarek then played a crucial role in our fieldwork, since he opened the door to the trolling industry in Egypt for us. He put us in touch with a supervisor who at the time was working for an Egyptian troll farm, and with a Turkish supervisor he had met during a training for troll farm supervisors that he had attended in Turkey in August 2019. The latter was a former Aktroller.2 We interviewed him on February 19, 2021; some days later, however, he asked us to forget everything he had said and then blocked us on Telegram.
The Egyptian supervisor (interviewee 2) then put us in touch with the third and fourth informants. The latter in turn put us in touch with two of his Egyptian co-workers, our informants 5 and 6. Informant 6, however, was terrified of revealing details of his work, even though we assured all interviewees that we would maintain their anonymity and securely store our recordings. We made it clear to all interviewees that our goal was neither to judge them nor to make them feel uncomfortable about their work, and that we just wanted to understand how they work and how they make sense of it. Despite these reassurances, informant 6 initially declined to be interviewed, then reconsidered, but did not show up for the appointment we set on Telegram. We were ultimately able to speak with him only in July 2021.
The contact with our last informant came from another personal contact of one of the authors, an Iraqi journalist, who introduced us to someone who worked for a local troll farm in Baghdad, Iraq.
Although we had begun to sense saturation in our interviewees' responses, we would have liked to extend the number of informants further to complete the theoretical sampling process (Charmaz, 2006). However, subsequent contacts provided by our first respondents did not agree to speak with us, and we found ourselves stranded again. We therefore recognize that our findings are exploratory in nature and may serve to shed first light on a field of research that needs more attention and further study.

Findings

In this section, we focus on those who work behind the scenes of troll accounts and disinformation production in three Arab countries: Tunisia, Egypt, and Iraq.
We think that drawing attention to their stories, their motivations, and their socio-cultural backgrounds, and chiefly considering them as digital media laborers is central to the understanding of troll farms as specific sites of media production.
Three main thematic clusters emerged from our fieldwork: (1) the organization of work in troll farms based in Tunisia, Egypt, and Iraq. We describe troll farms' hierarchies and the specific kind of job that fake account human operators do; (2) the way trolls make sense of their job; (3) the entanglement between troll farms and traditional media.

Inside an Arab Troll Farm: How Does It Work?


Troll Farms' Hierarchies

Over the course of our exploratory work, our informants revealed to us that a troll farm—in Egypt, Iraq, and Tunisia—is structured along a very clear hierarchy of professional roles. We found that in Iraq, as in Egypt or Tunisia, troll farms have similar hierarchies, which are very close to those described by Ong and Cabañes (2018). The production of disinformation, as Ong and Cabañes (2018) have rightly noted, is the result of "loosely interconnected groups of hierarchical digital workers" (p. 15).
At the top of the pyramid are the supervisors, who manage negotiations with "clients." They also monitor the progress of the trolls' work. Supervisors can earn up to approximately 20,000 U.S. dollars (the equivalent of the cost of a Kia Rio in Tunisia3) for every single disinformation campaign, or "project," as they call it.
Supervisors are professional figures very similar to the "chief architects of networked disinformation" described by Ong and Cabañes (2018). We adopted the term "supervisors" because it was the term our interviewees used to describe themselves or their bosses. Unlike in Ong and Cabañes' study, however, the two supervisors we spoke with share a background in journalism, and they confirmed to us that this position is often filled by professionals with a broad background in the media industries, especially broadcast, print, and online journalism, and not only in advertising and public relations firms, as was the case for the leading figures of the Philippines' troll farms described by Ong and Cabañes (2018).
For example, Tarek (interviewee 1) is a well-known Tunisian journalist who works for a famous Arab television network with headquarters in a big European city. Unlike his colleagues, Tarek avoided using words like "troll," "trolling," "fake news," and "manipulation." He gave us the impression that, in some ways, he was trying to distance himself from the stigma around trolling. He bragged about leading a team of 12 people, and his narrative was more focused on his work as a journalist and "political consultant and strategist":
"First of all, my job as a journalist consists of gathering, assessing, creating, producing, presenting, and spreading news and information, and I can say that I perfect my craft the best way! My work with political candidates in Tunisia is part of my job as a Tunisian journalist. I also help candidates and parties to better inform their voters and public in general, to convince them and to make sure the right message has reached the right person, in the right way" (Interviewee 1).
Tarek has always been very proud of his work and has never used the words "troll," "fake news," and "bot" to describe what he does. However, his behavior with us was very contradictory: on the one hand, he was keen to distance himself from the troll farms' world, but on the other hand, he put us in contact with people working as disinformation operators in Egypt and Turkey. If he did not work as a supervisor of disinformation agents, why did he tell us that he took part in a week-long training for supervisors of fake account operators in Turkey? He never answered this question. Moreover, Tarek described to us in detail the professional figures that work inside a troll farm, and his descriptions were confirmed by all the other disinformation operators we talked to.
Just below the supervisors are the "target logistics managers." Our interviewees described them as experts in determining the target audience and the suitable strategy for each client and each platform. According to Tarek, in some troll farms, the supervisor and the target logistics manager can be the same person.
Under them is the technical manager, an expert in software and technological programs and services. He is in charge of creating the fake accounts and the software to automate them, and he provides technical assistance to the disinformation operators. Only supervisors have a direct relationship with all workers. Sometimes, fake account operators are not even aware of the existence of technical managers and target logistics managers.
Supervisors are "public" figures: they meet politicians, work for advertising and public relations agencies, write for newspapers, or work for television networks. They are professional figures recognized as part of the domestic creative and cultural industries and they have well-established positions in media production companies. In the language of production studies, we could call them "above the line" professionals (Mayer, 2011). To be "above the line" is to be a creative professional, as well as the traditional object of media production studies. Everything below them, on the other hand, belongs to a gray area of media industries. The workers who operate different accounts and produce false or heavily biased information constitute the invisible, underground, hidden part of the advertising, political marketing, and public relations industries. These jobs could be likened to the "below the line" professionals in the television industry described by Mayer (2011): all those media workers who do not perform creative tasks and whose role in the media production process has long remained invisible and under-investigated.
We could therefore conceive of troll farms as shaped like an iceberg (see Figure 1), in which the visible tip is the presentable "face" of a well-established professional of the media industry, while the lower, submerged, "below the line" part is the actual troll farm.
Figure 1. Troll farms' hierarchy: "iceberg" model.
This hierarchical model shares many continuities with the one described by Ong and Cabañes (2018) in the Philippine case. The professional roles of both models are very similar, but in our case, the figure of "anonymous digital influencers" (Ong & Cabañes, 2018)—the aspirational, middle-class digital workers moonlighting as operators of anonymous accounts that command 50,000 or more followers on Facebook and Twitter—seems to be missing. We cannot rule out their existence, and more studies may detect their presence, but our informants reported to us a hierarchical architecture based on a sharp division between "above the line" workers and a kind of blue-collar "proletariat"—the fake account human operators—that simply carries out the orders it receives every day from the professional figures at the top of the hierarchy.
Bots are situated at the base of this iceberg-like shape. They perform repetitive and automated activities using a common script with the aim of spreading disinformation, promoting specific narratives, amplifying misleading messages, distorting online discourse, and creating fake trends and fake engagement (Howard & Kollanyi, 2016). They can be easily detected by platforms' algorithms, and for this reason, any disinformation campaign also needs human labor.
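To make this detection point concrete, here is a minimal sketch, assuming a simple posts-per-day cutoff of the kind used in computational propaganda research (Howard & Kollanyi, 2016) to flag heavy automation. The threshold value, data format, and function name are our own illustrative assumptions, not any platform's actual detection logic.

```python
# A minimal sketch of a frequency-based automation heuristic. The threshold,
# data format, and function name are illustrative assumptions, not a real
# platform's detection pipeline.
from collections import Counter
from datetime import datetime

POSTS_PER_DAY_THRESHOLD = 50  # illustrative cutoff for "heavily automated"

def flag_likely_bots(posts, threshold=POSTS_PER_DAY_THRESHOLD):
    """posts: iterable of (account_id, datetime) pairs.
    Returns the set of accounts whose busiest day meets the threshold."""
    daily_counts = Counter((account, ts.date()) for account, ts in posts)
    return {account for (account, _day), n in daily_counts.items() if n >= threshold}

# An account posting every 10 minutes around the clock (144 posts/day) is flagged:
bot_posts = [("bot_01", datetime(2020, 3, 1, h, m))
             for h in range(24) for m in (0, 10, 20, 30, 40, 50)]
print(flag_likely_bots(bot_posts))  # {'bot_01'}
```

Under a cutoff like this, splitting work across accounts pays off: the 300 daily comments that interviewee 7 describes below, spread over 10 accounts, average only 30 posts per account, which helps explain why campaigns combine many accounts with human-sounding activity rather than relying on a few high-volume bots.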
Bots automate the activity of fake profiles created by troll farms, but these profiles can also be managed by human operators. In this case, we are talking about "cyborg-like" accounts, partly human and partly machine, or "semi-automated accounts" (Nimmo, 2019). These profiles are periodically used by human operators to prevent platforms from detecting them as bots and to create the illusion that their behaviors and interactions are truly authentic. Bots, cyborg-like accounts, and human operators are key actors in the global "infrastructures of impostering" (Lindquist, 2021): they are all "imposters" who try to pass themselves off as someone they are not. Impostering, as Lindquist (2021) argues, is a socio-technical system made up of human and non-human imposters. Non-human imposters (bots) alone could not easily "game" the policing algorithms of Twitter or Facebook. Rather, the complex entanglement of human and non-human work is critical to the success of disinformation campaigns (Bonini & Gandini, 2020).
One of our Egyptian informants told us that in the defamatory attacks against Egyptian soccer star Mohamed Aboutrika, bots, semi-automated accounts, and human operators were all employed. Aboutrika is one of Egypt's most famous and beloved soccer players, but in 2017 the Egyptian government blacklisted him as a terrorist because he had publicly expressed his support for the Muslim Brotherhood (BBC, 2017). Trolls allegedly paid by Egyptian state agencies were tasked with producing a climate of opinion favorable to the government's decision by smearing the former player on social media. The trolls were given the task of producing a huge wave of hateful comments against him on his official Facebook account and on the Facebook pages of Aboutrika's supporters. This volume of negative comments came from bot-operated, human-operated, and semi-automated accounts. Our Egyptian interviewees told us that they employed a mixed strategy: bots spread messages against the soccer player on pro-Aboutrika pages, while human operators ran numerous fake accounts to respond to the comments previously spread by the bots.
The activity of fake accounts is therefore partly automated and partly human. The software extends and speeds up the production of comments, while the humans add the illusion of authenticity.
Even in this specific case, we can confirm what other scholars have already noted in other types of platform work: there is no clear dualism between human and non-human work. Trolling activity cannot be defined as only human or only automated; on the contrary, it is both. Bonini and Gandini (2019) unveiled the complex entanglement between human and non-human labor in the music curation practices of streaming platforms. We find this same entanglement in the troll farm industry. We could define trolling activity as both (1) a human activity "augmented" by algorithms and (2) a non-human activity designed, monitored, and edited by humans.

Fake Account Human Operators

In addition to fake profiles managed alternately by humans or bots, there are profiles animated only by human operators, the category to which most of our interviewees belong.
They normally work 8-hour shifts: from 8 am to 4 pm, from 4 pm to midnight, and from midnight to 8 am.
The environment is usually predominantly male. Tarek, the Tunisian supervisor, had four women on his team of 12, but other informants in Egypt and Iraq told us that they had never worked alongside women.
The masculinity of these work environments is another characteristic that makes the Arab troll farm industries very different from those described by Ong and Cabañes (2018): while the former seem to be dominated by men at every level of their internal hierarchy, transgender and LGBTQ+ workers occupy powerful roles within the hierarchy of the Philippines' farms.
Usually, troll farm workers do not even have time to talk with each other, as they enter a productivity-oriented working environment: "When you arrive at work, all that you hear is the sound of the keyboards, which would stay in your head even after leaving (the office)!" said interviewee 5. Interviewee 6 described his workplace as a "very controlled set," similar to "a dark prison," with a strong security system and cameras inside and outside.
In Egypt and Iraq, when a worker starts his shift, he receives a list of topics on Telegram that the supervisor wants him to work on. Each worker is placed in a Telegram group created by the supervisor. In these groups, workers receive not only the list of daily topics but also the frame to use in the messages and the type of emotional state they should provoke in the target audience.
There are usually several assignments per day, and supervisors provide the key topics on which comments and online conversations must be based. To ensure greater security, anonymity, and privacy, Arab troll farms tend to avoid browsers like Google Chrome and Firefox; Egyptian and Iraqi interviewees mentioned the use of the Tor Browser.
Human operators do not know the passwords of the fake accounts from which they must operate, because when they enter work, they find their computers already logged in and their accounts are ready for work. The only ones who know the passwords are the supervisors.
Interviewee 7 was tasked with attacking and intimidating Intidhar Tarek Jassim, a candidate in the 2018 parliamentary elections in Iraq. Over the course of a day, he produced 300 comments on Facebook using 10 different accounts, as well as sharing videos on YouTube from six different accounts.
After receiving their tasks via Telegram and finishing all their work, human operators are required to fill out an Excel sheet with the following details: the number of computers they worked on, the date, how many comments they wrote, how many articles they published, and how many conversations they created. All this information must be documented with links to their posts and comments.
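As a purely illustrative reconstruction of this reporting routine, the sketch below writes one such end-of-shift row to a CSV file. The column names and sample values are our own labels based on the fields the interviewees listed, not the farms' actual spreadsheets.

```python
# A hypothetical end-of-shift report row, based on the fields our interviewees
# described (computers used, date, comments, articles, conversations, links).
# Column names and sample values are illustrative, not taken from a real farm.
import csv

REPORT_COLUMNS = ["date", "computers_used", "comments_written",
                  "articles_published", "conversations_created", "evidence_links"]

def append_daily_report(path, row):
    """Append one operator's end-of-shift report to a CSV file."""
    with open(path, "a", newline="", encoding="utf-8") as f:
        csv.DictWriter(f, fieldnames=REPORT_COLUMNS).writerow(row)

append_daily_report("shift_report.csv", {
    "date": "2020-03-01",
    "computers_used": 1,
    "comments_written": 300,  # cf. the workload interviewee 7 reports above
    "articles_published": 0,
    "conversations_created": 12,
    "evidence_links": "https://example.com/post1;https://example.com/post2",
})
```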
Fake account operators are asked to create engagement at any cost. Supervisors also use gamification practices to push them to achieve set goals: they try to turn a tedious job into a game in which the hardest worker is rewarded first. For example, as interviewee 4 put it,
"in interviews and trainings they (the supervisors, n.d.a.) convince us that this is not a job like any other, it is a passion that requires more effort and more attention. We must work hard and do our best. We need to engage more users. Usually the supervisor(s) monitors the progress of the posts we create. The post that receives the highest number of comments, likes and shares gets rewarded."
A human operator is required to have a certain level of language skill, as he must write comments, posts, and tweets in his own words. Copy-pasting from common scripts is strictly forbidden; that is left to the bots.

"I Do It for Money": Why People Become Trolls

People who work in troll farms never dreamed of doing this job. Often, they answered ads for other types of jobs, such as copywriter, online journalist, back-office, or customer service positions, behind which jobs as fake account operators were in fact hidden. In the case of interviewee 7, for example, the Iraqi troll farm where he worked introduced itself in a Facebook-sponsored post as an American online public relations agency based in Baghdad looking for a copywriter with English skills. Interviewee 7 was hired immediately and was initially extremely happy: "Working in the social media environment as a copywriter at an American company was a dream for me." However, a few things he noticed when he started working there made him suspicious. For example, he noted that the company did not even have a sign at the entrance, and he was paid in cash:
"When I pointed these things out to them, they replied that they had not yet found a name since it was a new-born company, while as far as the salary was concerned, since it was a new American company, they told me that to open a bank account in Iraq required more time and financial operations were temporarily blocked, so they couldn't pay me through a bank transfer. This began to seem strange to me, however, I kept quiet because I had finally found a job after years of waiting!"
Most of our interviewees who work as fake account operators do this job for the money, or because they are unemployed even after years of study. Our interviewees told us that they earn between 300 and 500 U.S. dollars per month. They are all hired without a legal contract and are paid in cash in local currency.
Financial motivations were the first reasons our respondents cited, and this resonates with what Ong and Cabañes (2019, p. 5783) observed among paid troll workers in the Philippines.
Yet, choosing to take jobs like these also depends on another reason: most of the fake account operators we interviewed have degrees in Journalism and Communication studies and aspire to work in the media industry. For many of them, this job represents the first or only professional experience available in the media world.
Interviewees 2, 3, 4, and 7 added that they do it because the job pays well enough for them to support themselves without asking their parents for more money. Interviewee 5, for example, was looking for a job that would allow him to continue his studies and not be a burden on his parents, who already have four other children to raise.
For interviewee 7 (Iraq), by contrast, there were also more personal reasons beyond the financial ones. He told us that in his culture
"if you are 30 years old and you don't have a job you are considered a failed man, because you can't support a family. You can have all the degrees in the world, but if you are not married, if you don't have children and a job, you are not worth anything to Iraqi society."
That is why he took this job.
Tarek's case (interviewee 1), however, is different. Money is not the only thing that motivates him, although he clearly told us that "where there is money, there is also me." According to him, hanging out with political figures and being their advisor means being able to boast a prestigious and powerful position in society. He never admitted to working in a troll farm, but at the same time he boasted several times that he knows how one works. Tarek occupies a front-stage position in Tunisian politics and prides himself on knowing what happens backstage.

"Guilty and Ashamed": Making Sense of Being a Troll

All our interviewees apart from Tarek admitted that they were ashamed of doing this job. They clearly and repeatedly said that they felt guilty. None of them felt comfortable talking to us about their work, and they agreed to talk only after we insisted and made it clear that we did not want to judge them, only to understand them. Yet, moral justifications work differently for people at different levels of the hierarchy. Fake account operators tended to feel more guilt about the job and were the most likely to quit, while supervisors like Tarek expressed no moral qualms at all. The case of the Egyptian supervisor was slightly different: he voiced a sense of guilt but has nevertheless continued doing the same job since 2016.
Interviewees 3 and 7 felt so bad while recalling some episodes that we decided to stop the interviews. Interviewee 7 told us that after the attack on Iraqi candidate Intidhar Tarek Jassim, in which he had to accuse her of being a prostitute, he felt very sick and decided to quit his job, even if it meant going back to being seen as a "failure" by his family.
Interviewee 3 told us that he took part in the social media attacks on Saudi journalist Jamal Khashoggi and felt very bad about what he had to do:
"Khashoggi was killed in his country's consulate in Istanbul, and we, on Telegram, were asked to continue defending the Saudi regime by attacking the narratives of the Turkish police. It was very hard for me to keep doing that job, I felt frustrated and guilty. It was all stressful because I felt that I also killed him in some way, so I decided to quit" (Interviewee 3).
As Ong and Cabañes (2019) noted in other geographical contexts, in our case too, troll farm workers were reluctant to carry out the tasks they received from above. They are aware that they have participated in despicable and unjustified attacks on public figures and opponents of the state, and all expressed shame about it. Troll farms should therefore be conceived not as places where an army of fanatics works for ideological reasons, but as sites of digital content production where workers, in the absence of better opportunities, reluctantly accept a job they do not like in order to be economically independent.
The awareness of doing an extremely ignoble job is also demonstrated by the fact that many of them used the term "الذُباب" (mosquito in Arabic) to define their work and their professional figure. The first to use this term in a derogatory way was Tarek, our first interviewee. Tarek used the term blue mosquitos (الذُباب الأزرق) to describe the digital supporters of the Tunisian Ennahda4 party: blue in reference to the color of the party logo, and mosquitos because, he said, they behave like mosquitos. During our fieldwork, terms like "mosquitos" and "online mosquitos" were used and repeated many times by the interviewees, who employed them to describe automated (bot), semi-automated, and human-operated fake accounts alike. Comparing disinformation operators to mosquitos captures the nature of the role they are expected to fill: a very annoying and insistent one that aims to hijack the attention of the online public sphere. Interviewee 2 explains the meaning of the term: "mosquitos can be one of the most annoying things ever. In short, a very annoying behaviour and it is very similar and close to what we are called to do." According to interviewee 2, the term "online mosquitos" became popular and spread in 2017 during the diplomatic crisis known as the "Gulf Crisis" (Jones, 2019) between Qatar and four Arab countries: Saudi Arabia, the United Arab Emirates, Bahrain, and Egypt, which severed diplomatic relations with the state of Qatar. Interviewee 2 remembers that
"at that time, the stream of messages on Telegram didn't stop at all. We tweeted all day long. We made posts against Tamim bin Hamad Al Thani, others against Al Jazeera, known at that time as 'Al Khanzira' (the pig), others against the Egyptian, Emirati and Bahraini communities based in Qatar who decided to stay there: we described them as infidels, as slaves, as money lovers, as mercenaries and as unpatriotic."
In Arab culture, mosquitos are perceived as a disturbing and inevitable element of everyday life, a constant buzzing nuisance and a cause of several diseases. Mosquitos generally appear only at specific moments, when the weather gets warmer and stagnant water accumulates. Similarly, online mosquitos appear as a swarm during momentous events to mobilize, polarize, and manipulate public sentiment.

The Entanglement Between Troll Farms and the Arab Media Ecosystem

Troll farms are not companies that operate outside the media system. As far as we could understand from conversations with our interviewees, these companies are an integral part of domestic media ecosystems. Our interviewees did not know the owners of the companies they work for, but they were convinced that they were funded by state apparatuses and by private and public mainstream broadcast media. The role of troll farms is complementary to that of the mainstream TV channels that support the government. Interviewee 5 is convinced of this:
"After 10 years since the revolution of 2011, I can assure you that we are still fighting against the same environment, an environment marked by censorship and repression in which traditional media play a leading role in shaping the public opinion and the international media agenda. Post-revolutionary regimes continued using governmental and private traditional media to silence critical voices. The situation worsened after the birth of the so-called troll farms. The state discovered a new tool to continue its propaganda taking in consideration that traditional media has lost some of its credibility especially among young people" (Interviewee 5).
While traditional media target an older population, social media are used to reach younger audiences. In this hybrid media ecosystem (Chadwick, 2017), troll farms represent a continuation of state propaganda by other means. Interviewee 5 noticed this intermingling of media, state politics, and troll farms when he came home from work one night, turned on the television, and heard the host of the political TV talk show he was watching using the same topic and the same agenda he had worked on all day at the troll factory to attack Mohamed Ali.5
Interviewee 5 explained to us that Mohamed Ali's videos brought chaos to the troll farm where he worked. For months, his energy and time were dedicated above all to this case; "we were called to write hundreds and hundreds of tweets and posts on Twitter and Facebook, using very specific hashtags for Twitter like محمد-علي-خاين# (#Mohamed_Ali_traitor), احنا-١٠٠-مليون-سيسي # (#We_are_100_million_Sisi), متحدين-ضد-اعدائك-يا-مصر # (#United_against_your_enemies_Egypt), نعناعة-بتهدد-الجيش # (#Nanaa_is_threatening_the_army)."
In Tunisia, Tarek told us that not only do political parties like Ennahda employ "mosquitos," but radio stations like Mosaïque FM6 and IFM do so as well. They use groups of paid trolls to create fake engagement and fake trends.

Conclusion

Studies on disinformation industries and practices in the Global South are still a minority compared to the strand of literature dealing with disinformation in the Western world. This article, drawing on data from a Global South research context, contributes to expanding the Euro-American framing of disinformation and trolling and to situating disinformation activities within the broader context of digital media industries. Framing troll farms as a component of the broader ecology of digital media industries helps to reframe paid trolls as digital workers and to reconnect the research on disinformation for hire to digital labor studies. Digital labor studies in the Global South are growing rapidly (Abilio et al., 2021; Anwar & Graham, 2021; Graham & Ferrari, 2022; Soriano & Cabañes, 2020) but mostly focus on gig economy jobs. With this article, we wanted to show how useful the production studies perspective can be for understanding an under-investigated form of digital labor in the Global South.
Similar studies already exist, but this is among the first to adopt this approach in the context of the Arab world.
What differentiates Arab troll farms from those operating in other Global South countries, such as the Philippines (Ong & Cabañes, 2018)? There are mainly three differences: (1) Arab troll farms are a predominantly male environment, with a very low presence of women and LGBTQ+ people, unlike Filipino work environments. (2) The distinction between "above the line" and "below the line" workers typical of the media industries (Mayer, 2011) also exists in troll farms, but in our case this distinction is very sharp: among the professional figures of Arab troll farms there is a lack of the "middle-class" workers who are instead present in Philippine troll farms. (3) Supervisors of Arab troll farms have a background in the media industries, mainly in journalism, and not only in advertising and PR firms, as noted by Ong and Cabañes (2018).
As in other media industries, the market for the industrial production of disinformation also contains a large portion of invisible, precarious, and emotionally intense work, which we have likened to the "below the line" jobs of other media industries. We have shown how fake account operators work in extremely precarious conditions without a legal contract, are highly surveilled, and are subjected to very intense shifts and emotionally burdensome daily tasks. In socio-economic contexts that do not offer great career opportunities to young graduates in media, journalism, and communication studies, they often find themselves forced to accept a job they are ashamed to do. These job choices are never made lightly and have psychological consequences for the workers themselves, who struggle to bear the emotional burden of having to generate waves of fake hatred toward other human beings. The emotional and psychological burden of this type of work is similar to that of another emerging category of invisible workers in the media industry: content moderators on digital platforms (Gillespie, 2018; Roberts, 2019). The production studies approach is useful in this context because it allows us to relate the invisibility of fake account operators to the invisibility of other forms of media work (Bonini & Gandini, 2015) and thus shows the many similarities between this work and other media industry professions. Working in troll farms can thus be understood as a specific type of media and digital labor, requiring good language and writing skills, good knowledge of social media affordances, and a background in media, journalism, and communication studies. Yet, unlike other media jobs that require similar skills, the "troll" profession is a form of "underground" digital media labor performed in precarious conditions.
If public relations and marketing agencies represent the visible and socially accepted side of the media industry, troll farms represent the submerged, underground, and invisible one, but, as we have shown in Figure 1, they are part of the same "iceberg."
What finally emerges from our study is, above all, the continuity between troll farms and the media industry. Troll farms are not a phenomenon external to and independent of national media systems; on the contrary, they represent an emerging sector of the media industry and are strictly related to it.
Troll farms are a direct emanation of the marketing and advertising industries, as Ong and Cabañes (2018) already noted: "Digital political marketing, including the troll work of creating fake news accounts and authoring 'fake news,' represent the continuity of—rather than radical departure from—existing logics and processes of political marketing" (p. 27).
Our research shows that digital disinformation strategies should be seen as an extension of traditional media power into the hybrid media system. The framing and agenda-setting strategies traditionally employed by broadcast journalism and the press are now extended to social media through the "weapon" of troll farms. In a hybrid media ecosystem, traditional agenda-setting strategies are no longer sufficient, because public opinion emerges from the complex entanglement of legacy and digital media. To publicize a certain agenda or frame, political actors therefore resort to traditional intermediaries (television journalists, public relations, and political marketing agencies) to "militarily" impose a narrative frame across broadcasting and social media. Social media represent the most recent battleground where political actors fight their long-standing propaganda wars. Wohn and Bowe (2016) have already demonstrated the contribution of digital platforms to the social construction of reality and their ability to act as "micro-agenda setters."
Troll farms should be then conceived as the by-product of a complex socio-technical assemblage of actors, including public relations agencies, digital marketing companies, political actors, users' data, algorithms, digital platforms, media regulation policies, and traditional media.
If we want to tackle this phenomenon, it is necessary, among other things, to act on many different levels: first, on the regulation of digital platforms, whose algorithmic affordances favor engagement at any cost; second, on the regulation of electoral campaigns and on the transparency of the funding they receive; third, on the adoption of industry policies that foster the existence of a healthier creative media industry, capable of producing decent working conditions that make job offers from disinformation agencies less attractive.
 
