Facebook content moderation is an ugly business. Here’s who does it

Some of the workers saw video of a man being stabbed to death. Others viewed acts of bestiality. Suicides and beheadings popped up too.

The reason for watching the gruesome content: to determine whether it should be pulled from Facebook before more members of the world’s largest social network could see it.

Content moderators protect Facebook’s 2.3 billion users from exposure to humanity’s darkest impulses. Combing through posts that have been flagged by other members of the social network or by the Silicon Valley giant’s artificial intelligence tools, they quickly decide what stays up and what comes down. But reviewing the posts comes at a cost. Constant exposure to violence, hatred and sordid acts can wreak havoc on a person’s mental health. Former content moderators have already filed a lawsuit against Facebook in which they say repeated exposure to violent images caused psychological trauma. There’s a reason being a content moderator has been called “the worst job in technology.”

It’s also an important job, and one that isn’t handled by Facebook employees. Instead, it’s outsourced to contractors, some of whom turn to drugs and sex in the workplace to distract themselves from the abhorrent images they see every day, according to a recent story in The Verge, which reported that some of the workers make as little as $28,800 per year. That’s just over the federal poverty level for a family of four.

Contracting in the tech industry has reached a flashpoint, escalating tensions in Silicon Valley’s world of haves and have-nots. Contractors and temps don’t get the health care or retirement benefits that full-time employees do, a difference that hasn’t gone unnoticed. Last year, contract workers at Google protested, demanding higher wages and benefits.

In a statement, Facebook defended the use of contractors, saying it gives the social network flexibility in where to concentrate its efforts.

“We work with a global network of partners, so we can quickly adjust the focus of our workforce as needed,” a Facebook spokeswoman said in a statement. “For example, it gives us the ability to make sure we have the right language expertise — and can quickly hire in different time zones — as new needs arise or when a situation around the world warrants it.”

Here’s a look at four of the companies that have worked with Facebook to police content.

Cognizant

A multinational provider of services to technology, finance, health care, retail and other companies, Cognizant offers app development, consulting, information technology and digital strategy.

Based in Teaneck, New Jersey, Cognizant has roughly 281,600 employees around the world, according to its annual report. Nearly 70 percent of its workforce is in India.

The company’s role in supporting Facebook’s content moderation activities was the subject of the recent story in The Verge, which reported that roughly 1,000 Cognizant employees at its Phoenix office evaluate posts for potentially violating Facebook rules against hate speech, violence and terrorism.

Cognizant Technology Solutions office in Chennai, India. The company works with Facebook on content moderation. (Madhu Kapparath/Getty Images)

The workers get two 15-minute breaks, a 30-minute lunch and nine minutes of “wellness time” per day. They also have access to counselors and a hotline, according to the report.

Still, some workers said that constant exposure to depravity has taken its toll. One former content moderator said he started to believe conspiracy theories, such as 9/11 being a hoax, after reviewing videos promoting the idea that the terrorist attack was faked. The former employee said he had brought a gun to work because he feared that fired employees would return to the office to harm those who still had jobs.

Cognizant said it looked into “specific workplace issues raised in a recent report,” that it had “previously taken action where necessary” and that it has “steps in place to continue to address these concerns and any others raised by our employees.”

The company outlined the resources it offers employees, including wellness classes, counselors and a 24-hour hotline.

“In order to ensure a safe working environment, we continuously review our workplace offerings, in partnership with our clients, and will continue to make necessary enhancements,” Cognizant said in a statement.

PRO Unlimited

Based in Boca Raton, Florida, PRO Unlimited provides services and software used by clients in more than 90 countries.

Last year, Selena Scola, a former PRO Unlimited employee who worked as a Facebook content moderator, filed a lawsuit alleging that she suffered psychological trauma and post-traumatic stress disorder from viewing thousands of disturbing images of violence. Scola’s PTSD symptoms can pop up when she hears loud noises or touches a computer mouse, according to the lawsuit.

On Friday, the lawsuit was amended to include two more former content moderators who worked at Facebook through staffing companies.

“Her symptoms are also triggered when she recalls or describes graphic imagery she was exposed to as a content moderator,” the lawsuit states, referring to Scola.

Filed in San Mateo County Superior Court in Northern California, the lawsuit alleges Facebook violated California law by creating dangerous working conditions. Facebook content moderators are asked to review more than 10 million posts per week that may violate the social network’s rules, according to the lawsuit, which seeks class-action status.

At the time the original lawsuit was filed, Facebook acknowledged the work can be stressful and said it requires the companies it works with for content moderation to provide support such as counseling and relaxation areas.

Facebook in a court filing denied Scola’s allegations and called for the case to be dismissed.

A Facebook spokeswoman said the social media giant no longer uses PRO Unlimited for content moderation. PRO Unlimited didn’t respond to a request for comment.

Accenture

One of the most prestigious consultancies in the world, Dublin-based Accenture has more than 459,000 people serving clients across 40 industries in more than 120 countries, according to its website.

People enter an Accenture office in downtown Helsinki. (Jussi Nukari/Getty Images)

In February, Facebook content reviewers at an Accenture facility in Austin, Texas, complained about a “Big Brother” environment, alleging they weren’t allowed to use their phones at their desk or take “wellness” breaks during the first and last hour of their shift, according to a memo obtained by Business Insider.

“Despite our pride in our work, Content Moderators have a secondary status in [the] hierarchy of the workplace, both within the Facebook and the Accenture structure,” the memo read.

Accenture didn’t respond to a request for comment. At the time, Facebook said there had been a “misunderstanding” and that content moderators are encouraged to take wellness breaks at any time throughout the day.

Some of Accenture’s clients have included other tech giants such as Google, Microsoft and Amazon. More than three-quarters of Fortune Global 500 companies work with Accenture.

Arvato

One of Facebook’s largest content moderation centers is in Germany, which last year began enforcing a strict hate speech law that allows social media companies to be fined up to 50 million euros ($58 million) if they don’t pull down hate speech and other offensive content quickly enough.

Arvato, owned by the German media company Bertelsmann, runs a content moderation center in Berlin. The company has faced complaints about working conditions and the toll the job takes on workers’ mental health.

In 2017, Arvato said in a statement that it takes the well-being of its employees seriously and provides health care and access to company doctors, psychologists and social services.
