Facebook content moderation is an ugly business. Here’s who does it

Some of the workers saw video of a man being stabbed to death. Others viewed acts of bestiality. Suicides and beheadings popped up too. The reason for watching the gruesome content: to determine whether it should be pulled from Facebook before more members of the world’s largest social network could see it. Content moderators protect Facebook’s 2.3 billion users from exposure to humanity’s darkest impulses. Sifting through posts that have been flagged by other members of the social network or by the Silicon Valley giant’s artificial intelligence tools, they quickly decide what stays up…
