The filth factory: Why social network content moderation is doomed to fail

Image: The Verge

A battle is raging between humanity’s evil side and the cleverness of Silicon Valley social network engineers. The evil side is winning. And it will always win.

The Verge published some insightful articles on this topic in the last few weeks. Casey Newton published the second in a series of exposés on the outsourced operations that moderate content for Facebook. It shocks the conscience, just as his first one did. Here are a few tidbits.

Marcus was made to moderate Facebook content — an additional responsibility he says he was not prepared for. A military veteran, he had become desensitized to seeing violence against people, he told me. But on his second day of moderation duty, he had to watch a video of a man slaughtering puppies with a baseball bat. Marcus went home on his lunch break, held his dog in his arms, and cried. I should quit, he thought to himself, but I know there’s people at the site that need me. He ultimately stayed for a little over a year. . . .

Michelle Bennetti and Melynda Johnson both began working at the Tampa site in June 2018. They told me that the daily difficulty of moderating content, combined with a chaotic office environment, made life miserable.

“At first it didn’t bother me — but after a while, it started taking a toll,” Bennetti told me. “I got to feel, like, a cloud — a darkness — over me. I started being depressed. I’m a very happy, outgoing person, and I was [becoming] withdrawn. My anxiety went up. It was hard to get through it every day. It started affecting my home life.”

Johnson was particularly disturbed by the site’s sole bathroom, which she regularly found in a state of disrepair. (The company says it has janitors available every shift in Tampa.) In the stalls, signs posted in response to employee misbehavior proliferated. Do not use your feet to flush the toilet. Do not flush more than five toilet seat covers at one time. Do not put any substances, natural or unnatural, on the walls.

“And obviously the signs are there for a reason, because people are doing this,” said Johnson, who worked at the site until March. “Every bit of that building was absolutely disgusting. You’d go in the bathroom and there would be period blood and poop all over the place. It smelled horrendous all the time.”

She added: “It’s a sweatshop in America.” . . .

More than anything else, the contractors described an environment in which they are never allowed to forget how quickly they can be replaced. It is a place where even Keith Utley, who died working alongside them, would receive no workplace memorial — only a passing mention during team huddles in the days after he passed.

Why does this environment even exist? Because people post terrible things on Facebook. To any engineer, the solution is obvious . . . use a machine to do that work. Couldn’t an A.I. recognize what’s offensive?
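To make the idea concrete, here’s a deliberately naive sketch of that “obvious” fix: a blocklist filter in Python. The blocklist terms and function name are hypothetical, purely for illustration; production systems use trained classifiers instead, but as Roberts explains below, humans still have to train them.

```python
# A deliberately naive sketch of the "obvious" engineering fix:
# compare each post against a blocklist of forbidden terms.
# BLOCKLIST and is_offensive are hypothetical names, for illustration only.

BLOCKLIST = {"slaughter", "gore"}  # stand-ins for real offensive terms

def is_offensive(post: str) -> bool:
    """Flag a post if any of its words matches the blocklist."""
    return any(term in post.lower().split() for term in BLOCKLIST)

print(is_offensive("Cute puppy photos!"))    # False: passes
print(is_offensive("video of a slaughter"))  # True: flagged
```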

Whether A.I. can do this job is the topic of a Verge article from earlier this week, an interview with UCLA professor Sarah T. Roberts about her new book, Behind the Screen: Content Moderation in the Shadows of Social Media. Here’s a little of what she told The Verge.

[I]f you talk to actual industry insiders who will speak candidly and who are actually working directly in this area, they will tell you that there is no time that they can envision taking humans entirely out of this loop. And I believe that to be true. If for no other reason than what I just described, we need human intelligence to train the machines right.

[People] are going to try to defeat [any algorithm]. They’re going to try to game it. We can’t possibly imagine all the scenarios that will come online. 

No way out

This is a classic example of asymmetric warfare, like American revolutionaries harassing British Redcoats or Afghan rebels setting roadside bombs to ambush Americans in Humvees. In asymmetric warfare, one side has overwhelming force, but the other side has guile and undisciplined creativity. The objective of the insurgents is not to win, but to wound.

And the insurgents achieve their objective precisely because of their lack of discipline: there is no “front” on which to wage the battle.

In this battle, the insurgents are people hell-bent on evading content restrictions to get offensive content online. Arrayed against them are the engineers and outsourced content moderators at Facebook, Instagram, Twitter, YouTube, and every other social network. And there are millions of these insurgents, all trying to sneak awful things into your feed.

Here’s my reasoning.

  • Social networks are unwilling to require pre-moderation — the reviewing of content before it is posted.
  • Social networks allow anyone to create an account for free, without proving who they are.
  • The rules for what is offensive are ever-changing — because people are continually coming up with new types of offensive content.
  • The line between satire or parody and offensive content is impossible to define. Trump in a golf cart driving past drowned migrants? Nancy Pelosi slowed down to appear drunk? Which is it?
  • Filth spreads. It attracts gawkers. They share it. If it’s close to the line, it’s clickbait.
  • A.I. cannot recognize “evil.” It can begin to recognize categories of evil, but it can be fooled — especially when there are people continually trying to find new ways to fool it (see the sketch after this list).
  • False positives will continue to be a problem. My friend Olivier Blanchard got suspended for a week for content that was clearly not in violation of Facebook’s policies.
  • When you ask humans to review offensive content all day long, it ruins their mental health. Casey Newton’s articles reflect that reality.
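That fragility is easy to demonstrate. Continuing the hypothetical blocklist sketch from above, obfuscations that a human reads effortlessly slip right past the filter:

```python
# Continuing the hypothetical blocklist sketch: trivial obfuscation
# that any human reads effortlessly defeats the exact-match filter.

BLOCKLIST = {"slaughter", "gore"}  # hypothetical stand-ins for real terms

def is_offensive(post: str) -> bool:
    return any(term in post.lower().split() for term in BLOCKLIST)

print(is_offensive("video of a slaughter"))   # True: caught
print(is_offensive("video of a s1aughter"))   # False: one digit swap evades it
print(is_offensive("video of a slaugh ter"))  # False: one space evades it
```

A trained classifier raises the bar, but the dynamic is identical: every patch invites a new workaround, which is exactly the asymmetry described above.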

Put this all together and you see that any social network that allows people to set up accounts for free, post anything before it is moderated, and share anything they see is going to be optimized to spread filth.

A.I. can’t fix it. Masses of people in sweatshop conditions can’t fix it. Social media, as currently constructed, is a filth factory. And we’re addicted to it.


One Comment

  1. Josh, you say that evil is winning, and that it always wins, but I find that hard to reconcile with my personal experience. I am a heavy social media user (FB, Insta, WhatsApp, YouTube – you can probably guess my age from that list), and I almost never see content that I find seriously offensive. Does that mean offensive content is not out there? No, of course not. But my subjective experience is far from “evil is winning.”