The human cost of keeping your social media streams clear of offensive material
In fall 2014, WIRED magazine published a feature about labourers who keep disturbing content out of people’s social media feeds. Facebook, Google, and Microsoft gave the magazine vague statements about protecting users but declined to discuss specifics about how they moderate their services. The article noted that many tech firms make their content moderators sign stringent non-disclosure agreements.
FIMS Assistant Professor Sarah T. Roberts told the magazine, “I think if there’s not an explicit campaign to hide [this practice], there’s certainly a tacit one.”
Roberts coined the term “commercial content moderation” (CCM) to describe the set of practices, often hidden, that constitute the human review of user-generated content destined for social media platforms.
A frequent commentator on CCM, Roberts observes that it dispels the romantic notion of the Internet as a forum for democratic free expression. For companies requiring CCM, such services can be considered proprietary information, integral to the “secret” practices and production of that company’s social media, she says.
A less tangible and more disturbing issue, Roberts notes, is that hiding such practices leads users to assume that machines take care of whatever moderation happens, an assumption that carries several significant implications.
“One, there is no understanding that human beings might be undertaking work that has the potential to harm them. But also, the notion of machines doing the moderating has implicit suggestions about the use of algorithms, and how the ‘best’ information must get through filtering mechanisms to make it to the destination site, as if human beings aren’t responsible for the creation of algorithms and the software and hardware that run them.”
There are a number of barriers to research in this field, including the challenge of finding CCM workers to interview because of non-disclosure policies. Roberts acknowledges that CCM is an umbrella term she uses, and not one that workers involved in the practice use themselves.
Moreover, such labourers are “dispersed around the globe in all kinds of different work environments, from working on-site at a major internet firm in Silicon Valley, to working in call centres in Iowa, Gurgaon, or Manila, contracting for boutique social media management firms, or even doing digital piecework through platforms like Amazon’s Mechanical Turk. This kind of stratification and global dispersal makes it a challenge just to identify who is practicing CCM and where they are located.”
With the support of a Seed Research Grant from Western University’s Social Sciences and Humanities Research Board, Roberts and LIS PhD student Andrew Dicks travelled to Manila in May 2015 to conduct a pilot study with Filipino workers. In previous research, which focused on a North American context, Roberts found an increasing tendency to outsource CCM work, particularly to the Philippines. She and Dicks interviewed five CCM workers employed in major corporate call centres in Manila, met with other individuals involved with the call centre industry and culture there, and visited a call centre company headquarters.
“The trip made it clear that follow-up work in sites like the Philippines is essential, in order to understand more about CCM work and workers, but also about the global flow of digital labour, labour in general, and global capital,” says Roberts, who’s currently at work on a manuscript called “Behind the Screen: Digitally Laboring in Social Media’s Shadow World.”
In the forthcoming book The Intersectional Internet: Race, Sex, Class, and Culture Online, Roberts has written a chapter in which she ties CCM to issues of race and gender, and outlines the challenges presented by outsourced CCM labour. Workers may have to embody a set of values that differs from their own moral codes and personal and cultural values while curating content destined for a place or audience different from their own.
“Working in CCM means putting aside one’s personal belief system and morality,” one long-time CCM worker, now a manager, told Roberts. Roberts adds that “it also means that workers are exposed to content that is personally deleterious or damaging, including content that may impugn the CCM worker’s own identities.”
CCM labourers frequently encounter a wide range of disturbing material, including warzone violence, acts of terrorism, child pornography, animal abuse, and hate speech.
As they sift through this lurid content, CCM workers must also weigh profits when making taste decisions. What happens when a video clip or picture violates a site’s guidelines but is also a viral sensation?
“CCM workers find themselves in a paradoxical role, in which they must balance the site’s desire to attract users and participants to its platform – the company’s profit motive – with demands for brand protection, the limits of user tolerance for disturbing material, and the site rules and guidelines,” wrote Roberts. (A pre-publication copy of this chapter is available on the Western Libraries website).
Roberts’ study of CCM labour is currently focused on comparative research into other sites around the world. She says it is very difficult to generalize about CCM working conditions, as they vary by country and company. She has observed that CCM often involves some form of compromised work status: employees may be working part time or on contract. Such designations make it more difficult for workers to access full benefits, including counselling. In a 2010 New York Times article about Internet content reviewers, one psychologist observed that such workers were likely to become depressed or angry, have difficulty forming relationships, and experience decreased sexual appetites.
“The companies that require CCM want to distance themselves from the work that their platforms necessitate, and they further treat it as low-grade and relatively low-skill work, meaning, of course, lower pay,” says Roberts. “Yet my research shows that CCM work is essential to the production process of social media, and plays a major role in brand protection for social media firms and for firms who engage their customers through those venues. So there is a major disconnect there, as there often is with all kinds of essential, yet unglamorous and even dangerous, work.”