SALT LAKE CITY -- Have you ever wondered who’s keeping Facebook PG-13?
Sure, you might see the occasional party picture that crosses the “tasteful” line, and everyone has that friend who believes each time they drop a four-letter word an angel gets its wings, but for the most part, Facebook is pretty conservative.
So who monitors that? Who goes through the sea of trivial media and brief opinions to make sure no one gets an eyeful of natural breastfeeding, or reads something hateful about one of Facebook's "protected categories"?
As it turns out, there are teams all around the world sifting through millions of images, videos and status updates each week, filtering out the objectionable material for you and your kids, while taking on a heavy burden themselves.
According to Adrian Chen at Gawker, “One moderator only lasted three weeks before he had to quit.”
Among the images he was subjected to daily: "Pedophilia, necrophilia, beheadings, suicides, etc.," he recalled. "I left [because] I value my mental sanity."
That's just one aspect. It seems there are three difficulties facing every individual who signs up to help keep Facebook teenage-friendly, and the first one is terrible pay. Chen interviewed Amine Derkaoui, a 21-year-old Moroccan man who spent several weeks training to filter Facebook content, only to find out he was being paid $1 an hour.
“‘It's humiliating. They are just exploiting the third world,’ Derkaoui complained in a thick French accent over Skype just a few weeks after Facebook filed their record $100 billion IPO,” Chen wrote.
With Apple, HP and Dell receiving heat for the way workers are treated at Foxconn, and Microsoft enduring similar PR because of what was going on at the KYE Factory, Facebook is probably hiding behind the "everyone is doing it" defense. But to workers making maybe $15 a day, and only if they're willing to put in the overtime, such an argument may be less than compelling coming from billion-dollar organizations.
The next two difficulties stem from one issue: content. The guidelines for Facebook content checkers aren't always easy to interpret, and some organizations take issue with the rules Facebook chooses to implement.
There are the obvious rules, like "No obvious sexual activity" or no "naked children," but what about "Mothers breastfeeding," or "No maps of Kurdistan"? These guidelines are a little more perplexing and may stir up resentment from lactation consultants or disoriented people in Turkey.
And even if the employee is an expert at understanding exactly what the regulations mean and has no moral opposition to their position, there is the third and probably most upsetting difficulty: actually viewing the material they've been asked to filter.
"Each moderator seemed to find a different genre of offensive content especially jarring. One was shaken by videos of animal abuse. For another, it was the racism: 'You had KKK cropping up everywhere.' Another complained of violent videos of 'bad fights, a man beating another,'" Chen wrote.
The New York Times recently published an article discussing how difficult the work is at American content-moderation firms. The Times noted that inside the U.S., workers make between $8 and $12 an hour, but some might still argue that's too little for what these workers are subjected to.
"You have 20-year-old kids who get hired to do content review, and who get excited because they think they are going to see adult porn,” Hemanshu Nigam, the former chief security officer at MySpace, told the Times. “They have no idea that some of the despicable and illegal images they will see can haunt them for the rest of their lives.”
You can contact Travis at firstname.lastname@example.org