Well, if yesterday the story was that warehouse robotics would be displacing a lot more workers, today’s story is that Facebook is hiring. They’re looking to hire 3,000 people, worldwide, and, while I haven’t seen the job requirements, it doesn’t sound on the face of it as if you need elite, FB-level coding skills to grab one of these jobs. What Facebook wants to bring on board are folks:
…to monitor videos and posts for violent or criminal acts, and potentially prevent tragedies from occurring.
The social-media site has faced calls to do more, and respond faster, after a murder and a suicide were recently shown live. The new employees, who will be added over the next year, will join 4,500 people already on Facebook’s content moderation force. (Source: Bloomberg)
Content moderation sounds relatively passive, but Mark Zuckerberg wants to prevent suicides and murders-in-the-making. And he wants to halt (alt?) the spread of misinformation as well. (Hey, Mark, where were you when we needed you on that front? Better late than never, I guess.)
“If we’re going to build a safe community, we need to respond quickly,” Zuckerberg wrote. “We’re working to make these videos easier to report so we can take the right action sooner -- whether that’s responding quickly when someone needs help or taking a post down.”
I wondered about what the life of a content moderator entailed, and when I went to the Google, up popped a 2014 article from Wired with this how-could-you-not-click headline, “The Laborers Who Keep Dick Pics and Beheadings Out of Your Facebook Feed,” which got right into just what content moderation’s all about.
It is as unglamorous as, and even gackier than, I would have thought.
…Michael Baybayan, an enthusiastic 21-year-old with a jaunty pouf of reddish-brown hair. If the space [where he works] does not resemble a typical startup’s office, the image on Baybayan’s screen does not resemble typical startup work: It appears to show a super-close-up photo of a two-pronged dildo wedged in a vagina. I say appears because I can barely begin to make sense of the image, a baseball-card-sized abstraction of flesh and translucent pink plastic, before he disappears it with a casual flick of his mouse. (Source: Wired)
Can a job get much worse?
Anyway, as of 2014, it was estimated that there were 100,000 content moderators out there, many of them Filipinos, working for companies like Facebook, at wages far lower than what a cafeteria worker at FB headquarters would haul down: $300-$500 per month. Content moderators were “hunting for: pornography, gore, minors, sexual solicitation, sexual body parts/images, racism.” Translation: “dick pics, thong shots, exotic objects inserted into bodies, hateful taunts, and requests for oral sex.”
Not all the content moderation jobs are outsourced overseas.
Many companies employ a two-tiered moderation system, where the most basic moderation is outsourced abroad while more complex screening, which requires greater cultural familiarity, is done domestically. US-based moderators are much better compensated than their overseas counterparts: A brand-new American moderator for a large tech company in the US can make more in an hour than a veteran Filipino moderator makes in a day. But then a career in the outsourcing industry is something many young Filipinos aspire to, whereas American moderators often fall into the job as a last resort, and burnout is common.
No surprise there. How’d you like to spend your day looking at “brutal street fights, animal torture, suicide bombings, decapitations, and horrific traffic accidents”? It’s no wonder that there are concerns about the psychological impact on these workers. Think PTSD.
And now there are going to be another 3,000 of them.
And, of course, with Facebook Live, there’s the urgency overlay, as the people doing all those hideous things are now streaming them in real-time. It’s no longer enough to get rid of some snuff video that was filmed god knows when. Now, those moderators are seeing what’s going down as it happens, and the pressure is on to go into immediate “see something/say something” mode in hopes of stopping the suicidal young woman from live-streaming her death, or the homicidal maniac from killing (as happened recently in Ohio) some random guy coming home from Easter dinner with his grandkids.
The good news is, Facebook’s hiring. The bad news is the jobs themselves. Which, by the way, won’t be around for all that long. Those elite coders? They’re looking for ways to automate content moderation. So those jobs will be going the way of the dodo, joining warehouse pick-and-pack. And coal mining. What exactly is it that people are going to do when rotten jobs like content moderation disappear?