“The thing I really care about is the mission, making the world open,” said Facebook founder and CEO Mark Zuckerberg eight years ago. “A lot of times, I run a thought experiment, ‘If I were not at Facebook, what would I be doing to make the world more open?’ ”
That was back when the world was young, when new web platforms known as “social media” seemed to augur a world of blissful transparency and happy connectedness. Just occasionally we remember the old optimism, as when people around the world witness the Iran protests in real time on Twitter, Facebook, and newer platforms. But most of us, we suspect, now take a dimmer view of social media.
Facebook in particular has had a tough time in recent days. Many Democrats, including a number of U.S. senators, have all but blamed the tech company for Donald Trump’s 2016 victory. Facebook and its subsidiary Instagram, the argument goes, left themselves open (that word again!) to armies of Kremlin-connected trolls intent on spreading lies about Hillary Clinton, and so helped tip the Electoral College to Trump.
Then in late December we had a far more distressing glimpse of Facebook’s inner workings. “The Worst Job in Tech: Staring at Human Depravity to Keep It Off Facebook,” read the Wall Street Journal headline. The piece explains the daily tasks of some 7,500 “content moderators” or “content reviewers” working for the Menlo Park, Calif.-based tech giant. The job of these moderators is to monitor Facebook posts and block offensive material. “Coping mechanisms among content moderators,” we learn, “included a dark sense of humor and swiveling around in their chairs to commiserate after a particularly disturbing post.”
Most of these moderators are not Facebook employees but contract workers. We can see why. They don’t last long in the job. “Former content moderators recall having to view images of war victims who had been gutted or drowned and child soldiers engaged in killings. One former Facebook moderator reviewed a video of a cat being thrown into a microwave. Workers sometimes quit on their first or second day. Some leave for lunch and never come back. Others remain unsettled by the work—and what they saw as a lack of emotional support or appreciation—long after they quit.”
The biggest problem: There’s so much content to ban. Vile posts appear faster than moderators can keep up, and technology isn’t much help. “Humans, still, are the first line of defense. Facebook, YouTube and other companies are racing to develop algorithms and artificial-intelligence tools, but much of that technology is years away from replacing people.”
Other tech companies—Apple, YouTube, Twitter—employ unknown numbers of content moderators of their own. Facebook’s, for their part, “can have as many as three face-to-face counseling sessions a year arranged by an employee-assistance program.”
As for Zuckerberg, he says he’s “dead serious” about dealing with the problem of filtering out repellent content—the sinister, racist, disgusting, and otherwise morally debased material human beings are unfortunately capable of producing in vast quantities. We wish him the best of luck. He can at least take comfort in the fact that he’s made the world more open.