
The Backstory

Adrian Chen wins November Sidney for spotlighting the workers who keep ‘dick pics’ and beheadings out of your Facebook feed
November, 2014

Adrian Chen wins the November Sidney Award for “Unseen,” a Wired feature on the invisible army of contractors who spend their days sifting through all the porn, gore, and hate speech that users try to upload to social networks.

The industry calls them “content moderators” or “mods,” but some might call them censors. There are over 100,000 of them, in the United States and overseas. Social networking giants like Google are notoriously secretive about what content moderators do and what criteria they use to do it.

Mods pick through the child pornography, car-accident footage, and animal-abuse videos that the world’s internet users attempt to share each day. Some services engage mods only in response to complaints; others filter all posts in real time. Sometimes moderators have to bend the rules for content the company deems “newsworthy,” like footage of violent political protests.

For this work they may earn as much as $20 an hour in the U.S., or as little as $312 a month in the Philippines. The job takes its toll: most people burn out within three to five months. The companies hire psychologists to screen prospective moderators and counselors to help them cope with job stress, but PTSD-like symptoms are common.

“Chen is working on the cutting edge of labor journalism,” said Sidney judge Lindsay Beyerstein. “This story raises important questions about workers’ wellbeing and free speech on social networks. And it’s a fun read.”

Adrian Chen is a freelance writer in Brooklyn, New York. He is a contributing editor to The New Inquiry.

Lindsay Beyerstein interviewed Adrian Chen by email

Q: How did you become aware of the content moderation industry?

A: I'd written a few pieces for Gawker about the outcries that erupted whenever Facebook deleted a politically sensitive image or profile for violating its content guidelines, and I always wondered how it determined what to delete, not from a policy standpoint but from a technical one. I'd assumed it was some sort of algorithm with limited human involvement. But while working at Gawker, I was able to interview a former content moderator for Facebook, who worked as a contractor for $2 an hour, screening often-horrific images from his home in Morocco. That's when I first realized that a significant amount of this work is done by low-paid workers in developing countries, and as I looked into it more, it became clear that an entire industry is built on selling this labor to American tech companies and other businesses.

Q: Why are these social media companies so reliant on contractors to do this work?

A: Obviously the biggest reason is cost. Moderation is a very labor-intensive process that needs to run 24/7, so you need a large number of workers compared with the relatively small core staffs tech companies keep. Also, flexibility: companies want to be able to quickly scale their moderation capability up or down depending on how fast they grow. So they hire cheap workers in the Philippines, or young recent college grads in the Bay Area who don't get benefits and can be laid off without a hassle. Many companies will tell you that they outsource this so they can focus on their "core" business, but this is disingenuous. If allowing people to share content on their service is their core business, moderating that content is just as central.

Q: A lot of this work is done overseas. The wages are much lower, but so is the cost of living in these countries. Are these workers compensated fairly?

A: The content moderators I spoke to in the Philippines were paid in line with other workers in the outsourcing industry. One moderator for Microsoft ended up making $500 a month after three years at his outsourcing company. That's pretty good in a country where something like a quarter of the population lives on less than $1 a day. Still, because moderation is classified as low-skill, "non-voice" work, moderators typically make less than people doing, say, phone-based tech support.

Q: The stress of looking at so much repulsive material wears people down. Can you get PTSD from this?

A: I'm not sure about the clinical diagnosis, but the two mental health experts I spoke to who have dealt extensively with content moderators say many exhibit PTSD-like symptoms: paranoia, "compassion fatigue" in which they become deadened to human suffering, depression, sexual dysfunction, and so on. It depends, of course, on what kinds of images they see and how good the work environment is. It's also important to note that the pressure of keeping up with quotas was as hard on some workers as the images themselves; it's the combination of repetitive, stressful work and horrific imagery that can make the job especially degrading. And if they're only temporary contractors, the sense of being literally disposable also contributes to the stress.

Q: Facebook and other major players are very tight-lipped about their content moderation policies. Should users have a right to know what they're not seeing, and why?

A: Facebook, Google and Microsoft are absurdly opaque about the process of content moderation, considering how crucial it is to their business. I hope that my article has prompted journalists with better relationships with these companies to ask more questions about their content moderators.

From a basic consumer rights angle, people should know that their personal information may be sent thousands of miles away to a third party about whose relationship to the client company almost nothing is known. There is also the more philosophical question of whether dumping the worst humanity has to offer onto an invisible army of low-paid workers, so that we can share vacation photos blissfully unaware, is the best solution the tech industry can come up with. And if it is, how are these companies making sure those workers are cared for and fairly compensated for the very real discomfort and risk they take on for the rest of us? We can't start to have that conversation without knowing who these workers are.

Q: A few months ago, the Hillman Foundation honored a feature about the so-called "Mechanical Turk," which connects tens of thousands of digital pieceworkers to big companies to perform repetitious tasks from home. Did you find any evidence that Turkers are involved in content moderation?

A: I looked into this, and there is definitely some lightweight moderation happening through Mechanical Turk; I even did a little of it myself. But it seems that any serious content moderation requires a captive labor pool managed by an outsourcing firm. The work is too complicated to hand off to random strangers on the internet. (Which is also why hardly any of it is automated.)

Q: You interviewed an academic who said that social networking companies want to downplay the human work that goes into creating digital spaces. Why is that?

A: Would you want to lay out your personal trash for everyone to poke around in? Social media companies are hyper-sensitive to the appearance of harboring child pornographers or encouraging cyberbullying, so I think a simple "see no evil" philosophy drives some of the secrecy. Then there are sensitivities around outsourcing, of course. More fundamentally, content moderation undermines the idea that tech companies are ushering in a new information economy built on code and ideas rather than on the unpleasant manual labor done by content moderators.
