Earlier this year, when Facebook reached a $52 million settlement with lawyers representing more than 10,000 former and current content moderators in four U.S. states, those moderators’ colleagues in India and the Philippines were left out entirely.

In their case against Facebook, workers in the United States had claimed that examining content on the social media giant’s platform, which can include screening for everything from child pornography to videos of terrorist decapitations, had led to severe psychological consequences. And like their American colleagues, Asian workers, thousands of whom are employed as content moderators by outsourcing firms like Genpact and Cognizant, have been reporting grueling working conditions that expose them to a stream of violent, graphic content, leaving some of them with lasting mental scars.

Facebook employs more than 15,000 content moderators globally. While the lawsuit provided relief for some moderators in the U.S., it did not address the plight of those who worked for the company abroad. 

“In the U.S., people can speak up for their rights,” one former content moderator, a 25-year-old man in India who requested anonymity, told Rest of World. “But it’s not the same here.”


The outsourcing boom began in India and the Philippines in the late 1980s and 1990s, when Western technology companies began offloading back-office functions like customer service and IT to countries with well-educated but considerably cheaper workforces. Today, business process outsourcing (BPO) companies employ more than 1.2 million people in the Philippines and more than 1.1 million in India. By the time Facebook began outsourcing its moderation efforts in the late aughts, the two countries had already established themselves as BPO hubs, most notably for call center work.

Unlike U.S.-based moderators, for whom the job is often temporary, many Indian and Filipino moderators consider BPO work a career path unto itself. These jobs, often performed by workers who are educated and English proficient but not particularly well-off, can function as a springboard to the middle class. Without these foreign workers, Facebook would be inundated with the type of violent content that would eventually drive users and advertisers away.

“Given the centrality of content moderation, it’s not natural to outsource it,” says Paul Barrett, deputy director of the Center for Business and Human Rights at New York University. “It makes it somewhat problematic right from the start because the function is that important.”

Rahul, a 30-year-old man in Hyderabad, India, who asked to use a pseudonym to protect his identity, joined Genpact after working in customer service at a bank for several years. The starting salary was good, close to $500 a month (now it’s closer to $200), and he found the opportunity to work on-site at Facebook alluring.

When Rahul arrived at the company’s Hyderabad office, he marveled at its plush couches, Ping-Pong tables, and meals catered by the best restaurants in the city. Rahul hadn’t seen anything like it before. The chance to tell his friends that he worked in Hitech City, a campus for some of the most notable international tech companies, seemed too good to pass up. “It was 60% of why I took the job,” he said. 

Photo: Content moderators work at a Facebook office in Austin, Texas. (Ilana Panich-Linsman for The Washington Post via Getty Images)

Rafael, a Filipino man who spoke to Rest of World on the condition that he be identified by a pseudonym to protect his identity, said that he took on content moderation work to advance his career. A former journalism student who enjoyed writing, he was happy to exercise his creativity by writing blog posts for some of his firm’s other clients, in addition to his moderation work.

Even when the work proved trying, he focused on the potential for professional growth. “To enrich my portfolio, I stuck with it for some time,” Rafael said. He also noted that as soon as he saw an opportunity to leave content moderation and take a new job, two years into the position, he “jumped on it.”

Daniel Charest, a Dallas-based attorney, represented some of the moderators in the U.S. suit. His success, he said, stemmed largely from being able to discuss in court the psychological harm the work had done to his clients, who had been diagnosed with PTSD, depression, and anxiety by mental health professionals. But in countries like India and the Philippines, social stigma and limited access to mental health care often stand in the way of speaking openly about psychological issues, let alone obtaining a diagnosis in the first place.

Rafael felt comfortable and even confident moderating everything from hate speech to pornography. But it was a graphic video of a child being abused that stuck with him. After seeing the video, he began to notice a change in his own behavior that worried him. “I am not a bad person,” he told Rest of World. “But I’d find myself doing little diabolical things, saying things I would regret. Thinking things I didn’t want to.”

At the time, the company he worked for did not offer its moderators any sort of in-house counseling. So he paid for therapy, using his own money. When asked if he believed companies like Facebook should be responsible for bearing these types of costs on behalf of moderators, Rafael said he had signed a nondisclosure agreement with his current company, preventing him from answering such a question.

“Sometimes the despair and darkness of people will get to you, even if you say, ‘Oh, I’m a professional, I can handle that,’” he said. 

In his first few weeks on the job, Rahul felt shocked by the graphic videos he encountered of car crashes and child abuse. Eventually, he grew desensitized. 

“It gets to a point where you can eat your lunch while watching a video of someone dying. … But at the end of the day, you still have to be human.” Rahul said he didn’t see a therapist — it wouldn’t have been useful to him, he said.

Rahul’s experience of desensitization mirrors that of other moderators, according to a former counselor who worked with Genpact and requested anonymity. When moderators finally did seek treatment — often after content reminded them of events in their personal lives — they were surprised to find themselves affected. According to the former counselor, “They’d say, ‘I’ve been doing this for so long. Why am I feeling this now?’”

According to Dr. K. Jyothirmayi, a Hyderabad-based psychiatrist, stigmas, such as the perceived impact of a mental health diagnosis on one’s marital prospects, often prevent young Indians from seeking treatment. 

The Philippines has comparatively progressive mental health legislation, but barriers and stigmas persist. 

Issues ranging from Manila’s notorious traffic to the prohibitive cost of mental health care may put treatment out of reach for the average Filipino BPO worker, according to Lawrence Wang, founder of MyGolana, Inc., a U.S.-based online counseling startup that works in the Philippines.

While Rafael had to seek private counseling, some BPO firms do offer this service in-house in the Philippines, though even that can be outsourced. 

Mariel, who also asked that she be identified by a pseudonym because she is bound by a nondisclosure agreement, worked as a hotline counselor at a BPO company. Her clients were employees at other BPO companies, including content moderators for Facebook. Because the sessions took place over the phone, Mariel was not able to offer psychological diagnoses.

Over the course of her two years with the company, Mariel had several clients who began to contemplate suicide after moderating suicide-related content. Callers who expressed suicidal thoughts or tendencies were flagged to the firm handling content moderation and referred to an in-person counselor. But, she said, there were no such protocols for moderators experiencing other issues, such as anxiety, depression, or sleep deprivation, that might alert the company to brewing mental health crises.

Photo: A man uses Facebook on a mobile phone in Yangon on June 7, 2018. (Ye Aung Thu/AFP via Getty Images)

“The deep seething thought will haunt you; it will creep you out,” Rafael said. “It’s impossible that it wouldn’t, because I saw it; it’s already recorded in my mind.” Rafael attributes his ability to recover from seeing the video to therapy.

“I feel like, if I had not consulted a specialist back then, it [the feeling of being haunted by the video] would have manifested in some form that I would have regretted later on.”

Even if moderators did receive in-person counseling, they would run into inhospitable legal climates. 

In India, workers’ compensation lawsuits are adjudicated through various laws, but the country’s legal framework does not consider mental health an occupational hazard. “Indian labor law frameworks were designed with factories in mind,” Ayush Rathi, a researcher with India’s Centre for Internet & Society, told Rest of World.

Geetha Devarajan, a labor lawyer and consultant to the Forum for IT Employees, a trade union for IT and BPO workers, said Indian content moderators might file a lawsuit under the country’s Industrial Disputes Act of 1947. But before doing so, moderators would have to unionize and enter arbitration with their employers to try to reach an acceptable compromise.

Though labor laws in the Philippines generally state that third-party companies share responsibility for employees working for a contracting firm, those regulations do not apply to the BPO industry — which allows third-party companies that outsource work to BPO firms to avoid responsibility for the employees who do that work. 

“The entire legal system in the Philippines is designed to attract foreign investment into industries like business process outsourcing,” said Alden Sajor Marte-Wood, an assistant professor at Rice University researching content moderation work in the Philippines. “Part of that means that there aren’t the same kind of labor protections as in the U.S.”

An abundance of cheap labor and high unemployment in both markets means that BPO workers are also made to feel disposable. 

“To keep people in check, companies will say that, if workers don’t follow, don’t meet their client’s goals, then the client will pull out and take the work to another country,” Mylene Cabalona, the president of the BPO Industry Employees Network (BIEN), which promotes BPO workers’ rights in the Philippines, told Rest of World. “And of course, people are scared to lose their jobs.”

Rohan Seth, a policy analyst at The Takshashila Institution, a think tank, added, “These may not be the best-paying jobs, but having something is better than nothing. … It’s almost a reason not to organize.”

Since the lawsuit, Facebook has agreed to changes meant to better support content moderators, including requiring outsourcing firms to offer more psychological support. A Facebook representative told Rest of World that these changes will also be rolled out for moderators working outside the U.S. The settlement, however, provides no compensation to international moderators who have already suffered psychological harm.

Marte-Wood worries that the threat of future lawsuits in the U.S. may push moderation work to places like India and the Philippines, where labor laws are thinner and similar lawsuits less likely. 

For NYU’s Paul Barrett, the only solution is to bring content moderators in-house, though he acknowledges it’s still cheaper for Facebook to employ contractors. “I think Facebook knows it has to employ a large number of Americans who understand idiomatic English and understand U.S. customs,” he said. “And $52 million is not that expensive to be squared away with more than 10,000 moderators.” 
