On Easter Day, a photo appeared before me as I scrolled through my feed, the kind they call a “thumb-stopper” because it pauses the stream. Perhaps you’ve seen it, or one like it—lately there have been many. A frame from a video, streamed live but cached for viewing, recorded in Cleveland: a man (elderly, black, wearing a cap and wire-rimmed glasses, holding a plastic bag) with someone’s arm, leading directly from the camera, holding a pistol to his head.
Google it if you want to. You can still find it, even though Facebook took it down.
“It is estimated that the number of content moderators scrubbing the world’s social media sites and mobile apps is over 150,000,” reads the opening title card of The Moderators, a documentary short released this April by investigative documentary firm Field of Vision. In a dull office in India, where traffic noise rises over the hum of fans, a small group of young workers has gathered for weeklong training. They’re here, a supervisor tells them, to ensure that images in violation of content policies do not find their way to users’ screens. They’ll be doing this work by hand.
The video of Robert Godwin Sr.’s slaying remained on Facebook for over two hours before it was flagged as “offensive content” and taken down. On May 3, Mark Zuckerberg announced Facebook would hire 3,000 content moderators in addition to the 4,500 they already employ. While we’ve been demanding user-facing moderation tools (e.g., features designed to help victims of harassment control what “content” can find its way to their feed), the content moderation industry has flourished in the backend. In service of a polished product, more people, not fewer, are guaranteed exposure to things they’ll wish they could un-see.
“I want experience,” says one young woman. She’s been looking for work for five months and has just been made aware of what the gig entails. “How much, I don’t know. But I want experience.”
Directed by Adrian Chen and Ciarán Cassidy, The Moderators is an outgrowth of a story Chen reported for Wired in 2014. Accompanied by photographer Moises Saman, Chen focused his interviews for the story on independent contractors in the Philippines and U.S. (Microsoft, Google, and Facebook all rejected Chen’s access requests.) The contractors appearing in the film work for an Indian dating site.
Moderators work to address what Chen calls the “Grandma Problem”: that late-adopters will abandon the platform if they find ISIS propaganda nestled between baby photos and birthday invites. In the film, the trainees’ giggles are caught in their throats when the slideshow changes and a twerking woman becomes a head severed by train-tracks. (Both photos were deemed offensive.) One moderator, who worked outside Manila, quit after eight years, admitting in an interview what disturbed her most—as if naming a genre: “Bestiality with children.”
It’s uncomfortable to think of such things even in the abstract. And yet, thousands of people contracted by the most powerful organizations today are not allowed to avert their eyes from the actual evidence, accepting an undue burden so that the rest of us can socialize free of it entirely. Chen quotes an academic studying content moderation who believes there’s a “tacit campaign” to disguise the existence of content moderators, in order to make the web somehow appear naturally clean. She says, “It goes to our misunderstandings about the Internet and our view of technology as being somehow magically not human.”
And magic, unexpectedly, is the crux of it. Like the Wizard of Oz, the technocrats’ mandate-to-rule is secured through the incomprehensibility of their power. In A General Theory of Magic, the anthropologist Marcel Mauss lists among his criteria for the existence of magic a group’s consensus that magic exists and that it holds power over them—not, notably, that its power is understood. Substitute the word “algorithm” with “spell” and see how little changes.
But the reality is that today’s algorithms monetized by Silicon Valley aren’t incomprehensible; they aren’t even really that complicated in theory. The most intricate are basically networks of linear equations passed through simple nonlinearities, while the truly revolutionary developments are in the hardware that supplies the processing power. Silicon Valley learned from banks that if you make these things appear more complex than they actually are, people will quickly give up trying to understand them, and then you can really get away with a lot. (You may recall that Zuckerberg’s mentor Peter Thiel made his fortune bridging this very gap at PayPal, before founding Palantir.) Better yet, get people to believe your proprietary software handles work that’s actually outsourced to Manila.
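To see how little mystery there is, consider a toy sketch (an illustration of the general idea, not any company’s production system): a neural-network layer is just a matrix multiplication, a bias, and a simple nonlinearity, stacked a few times.

```python
import numpy as np

# A toy two-layer neural network. Each layer is a linear map
# (matrix multiply plus bias) followed by a simple nonlinearity.
# The weights here are random placeholders, not trained values.
rng = np.random.default_rng(0)

def relu(x):
    # The nonlinearity: zero out negatives, pass positives through.
    return np.maximum(0, x)

def layer(x, weights, bias):
    # The "spell": a system of linear equations, nothing more.
    return relu(weights @ x + bias)

x = rng.standard_normal(4)                                      # 4-dim input
h = layer(x, rng.standard_normal((8, 4)), rng.standard_normal(8))
y = layer(h, rng.standard_normal((2, 8)), rng.standard_normal(2))
print(y.shape)  # (2,)
```

Everything past this, at scale, is mostly an engineering problem of data and hardware, which is precisely why the aura of incomprehensibility is worth interrogating.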
The Moderators’ shock value depends not on the fact that such jobs exist, but that you probably hadn’t really thought about it. It shows how the primacy of the algorithm leads to a “disruption” of the labor force itself. Riding on the back of neoliberalism, the gig economy represents a caste-like system in which the worker is valued beneath even a bit of code. And yet, while the technocrats replace or obfuscate human labor with automated and algorithmic processes, this displaced labor force remains responsible for funding, through taxes, the pre-existing superstructure. Bill Gates, to his credit, makes this case in arguing for a robot tax.
Technofeudalism is a relatively new term for an economic condition in which the means of production have shifted from ownership of land to ownership of intellectual property. As jobs like those of moderators become more prominent and blue-collar work increasingly involves coding and development, human labor is directed toward reinforcing the supremacy of technocratic elites by enhancing the software that separates them. This distancing allows for the exploitation of the labor of development, as the actual work that goes into building technological systems is obfuscated by the drive toward (or belief in) software as essentially sentient—or, in the case of the singularity, omniscient. The adoption of algorithms and other data-centric means of production by other industries transforms them into fiefs of Silicon Valley.
The Moderators shows what wide-eyed entry into such servitude to the machine looks like, and the importance of this awareness is compounded by rumors of Valley royalty, like Zuckerberg’s desire to run for public office—under which we might imagine global technofeudalism would be offered as antidote to the protectionist policies of the current administration. They built their platforms for user-generated content, we bought in, and now they get to decide what’s worth seeing, who gets to see what, and what we sacrifice in the name of comfort.
“Don’t take it personally,” says the supervisor, who views his gig with moral imperative. “You are here to moderate this stuff so that no one else can see.” But when Facebook completes its appropriation of traditional media, where will we find the evidence of our suffering?