Professor uncovers the Internet's hidden labour force

That hidden labour, according to one professor, needs to be unmasked to better understand a young and quickly evolving industry, one whose workers do essential jobs but whose nuanced needs are not adequately considered.

“Ultimately, there are people who exist between you and your social media platform of choice,” said Sarah Roberts, a professor in the Faculty of Information and Media Studies. “And that just dispels the notion of Web 2.0 applications, which were all about you relating to your platform one-to-one, being you, expressing yourself to the world, one-to-one.”

Roughly three years ago, thanks to a small New York Times article, Roberts learned of a group of people largely referred to as “screeners.” These individuals work for various companies – ranging from large Silicon Valley corporations to social media outlets to small business operations – to filter inappropriate content out of the commercial online world, including things such as hate speech, abusive imagery, warzone footage and child pornography.

Think of companies like Google or Facebook. They all have people doing this kind of work, part of an industry Roberts has dubbed “commercial content moderation” (CCM).

Last year, Roberts appeared on NPR’s All Things Considered to discuss the CCM industry and its implications. Last month, she gave a talk at UCLA.

“I’m in my 21st year on the Internet and I’m surrounded by digital media scholars, really savvy people. I read this article and went to them and said, ‘Did you hear about these people doing this moderation work for profit set-ups?’ They had never heard of that,” she said.

“I couldn’t let go of it. I kept thinking about it – the implications of it, how far reaching is this, where is it taking place. If it’s in this sector, it must be all over. So, I started to pursue it. What I discovered is it’s really difficult to get people to talk to me. They are under non-disclosures for the most part.”

CCM is a disjointed practice that grew out of necessity. Through connections and persistence, Roberts found no unified trade groups or governing practices. Employees were contract workers, often working out of call centres. To prevent burnout, some were limited to a year on the job, followed by an optional three-month leave. If they wanted, they could renew for one year after that.

“I found workers interested in talking to me because they wanted somebody to hear about the work they do and its impact,” she said.

“I went into it thinking this was probably a difficult job, but there were things I couldn’t have imagined about it. I found out these are highly sophisticated workers, doing a task that, in an era of machine learning, mechanization and computerization, can’t be computerized.”

She couldn’t have imagined some of the implications or repercussions of working in the industry, Roberts explained.

“Despite this work being critical, it’s completely obfuscated and I think that’s by design. It’s an unpleasant reality that is necessitated by social media and it’s not something most companies want to trumpet. It’s the dirty underside and it has some real social costs, psychological costs that have yet to be measured.”

These contract workers earned less than others in the company. They had no health insurance or benefits, no support, no job stability, Roberts said. She heard of only one employee who opted for a second contract, another “tour of duty.”

“These people also didn’t bring up what they did at work (with their friends and family) not because of non-disclosure agreements but because it wasn’t something people wanted to hear about,” she added. “So, there’s the isolation factor.”

And as hard as the job was, it was necessary, and these workers knew that, she continued.

“They take a lot of pride in their work – they felt like they were performing a service for the good, so other people wouldn’t have to be subjected to what they saw.”

Roberts noted the employees had sophisticated views of concepts like free speech, seeing behind the veneer of a supposed ability to freely circulate information online. They see the human worst, and they see the material in a larger context, too, she said, noting a parallel one worker drew between banned images from Mexico’s drug wars and widely circulated images coming out of Syria.

Further research is needed in this relatively new field, Roberts explained. She is interested in following CCM as it continues to be outsourced, especially to countries like the Philippines.

“The work is so nuanced and so specific with regard to human taste, cultural familiarity, sensibility, context – things you can’t program, never mind legal regimes and things like copyright,” she said.

“I want to follow the work as it follows a well-trodden globalization path and see what it means when this work is done in a different part of the world, where it’s destined for other audiences. We’re asking these moderators to make taste decisions. They’re probably not making taste decisions based on their own moral code or sensibility, but trying to embody someone else’s. That’s what’s fascinating. Whose?”

From Western News, March 20, 2014
By Adela Talbot