Facebook’s rationale for booting Alex Jones and InfoWars: you can lie, but you can’t be hateful
Facebook has joined Apple, YouTube, and Disqus in banning Alex Jones and yanking his InfoWars pages. Jones and his supporters are livid, claiming free speech violations, while liberals are wondering what took so long. The challenge here is that lying on Facebook is perfectly fine; only “hate speech” will get you banned.
Let’s take a close look at Facebook’s statement on why it banned these pages, and how it will apply its policies in the future. In the excerpts that follow, the commentary is mine.
August 6, 2018
Enforcing Our Community Standards
We believe in giving people a voice, but we also want everyone using Facebook to feel safe. It’s why we have Community Standards and remove anything that violates them, including hate speech that attacks or dehumanizes others. Earlier today, we removed four Pages belonging to Alex Jones for repeatedly posting content over the past several days that breaks those Community Standards.
Commentary: The issue here is a balance between free speech and preventing the use of Facebook to promote hate speech based on race, gender identity, or the like. The opening clearly lays this out and describes the impact. The statement is written in the first person, with “we.” This makes it clear and accessible, as opposed to typical passive corporate statements that are impersonal and distance companies from their policies. The title isn’t clear enough, though: it should have been “Why we banned Alex Jones and InfoWars.”
Here’s more detail on enforcement of our standards:
How do you deal with people and Pages who repeatedly violate your standards?
Simply removing content that violates our standards is not enough to deter repeat offenders. It’s why every time we remove something, it counts as a strike against the person who posted it. And when it comes to Pages, we hold both the entire Page and the person who posted the content accountable. Here’s a step-by-step overview of what happens when content is reported to Facebook:
- If a Page posts content that violates our Community Standards, the Page and the Page admin responsible for posting the content receive a strike.
- When a Page surpasses a certain threshold of strikes, the whole Page is unpublished.
- For people, including Page admins, the effects of a strike vary depending on the severity of the violation and a person’s history on Facebook. For example, some content is so bad that posting it just once means we would remove the account immediately. In the case of other violations, we may warn someone the first time they break our Community Standards. If they continue, we may temporarily block their account, which restricts their ability to post on Facebook, or remove it altogether. . . .
Commentary: Facebook is smart to describe this as enforcement of a clear policy rather than focusing specifically on Alex Jones.
How do you distinguish between fake news and content that breaks your Community Standards?
People can say things on Facebook that are wrong or untrue, but we work to limit the distribution of inaccurate information. We partner with third-party fact checkers to review and rate the accuracy of articles on Facebook. When something is rated as false, those stories are ranked significantly lower in News Feed, cutting future views by more than 80%.
When it comes to our Community Standards, they’re focused on keeping people safe. If you post something that goes against our standards, which cover things like hate speech that attacks or dehumanizes others, we will remove it from Facebook.
Commentary: This is clear, although you may not agree with it. Lying on Facebook is acceptable; the penalty for fake news is simply that it spreads less widely. Hate speech is tough to judge. The American Bar Association defines it this way: “Hate speech is speech that offends, threatens, or insults groups, based on race, color, religion, national origin, sexual orientation, disability, or other traits.” It depends on denigrating a group. So if I say that Tim Cook is stupid, that’s not hate speech, but if I say that gay people are stupid, it is. Applying this distinction creates a grey area where hateful speech is fine until it’s racist or sexist or homophobic, and where personal attacks are harder to police. It will also encourage haters to use more code words and dog-whistle insinuations to attempt to escape enforcement.
So what happened with InfoWars? They were up on Friday and now they are down?
As a result of reports we received, last week, we removed four videos on four Facebook Pages for violating our hate speech and bullying policies. These pages were the Alex Jones Channel Page, the Alex Jones Page, the InfoWars Page and the Infowars Nightly News Page. In addition, one of the admins of these Pages – Alex Jones – was placed in a 30-day block for his role in posting violating content to these Pages.
Since then, more content from the same Pages has been reported to us — upon review, we have taken it down for glorifying violence, which violates our graphic violence policy, and using dehumanizing language to describe people who are transgender, Muslims and immigrants, which violates our hate speech policies.
All four Pages have been unpublished for repeated violations of Community Standards and accumulating too many strikes. While much of the discussion around Infowars has been related to false news, which is a serious issue that we are working to address by demoting links marked wrong by fact checkers and suggesting additional content, none of the violations that spurred today’s removals were related to this.
Free speech vs. hate speech
Let’s start with the First Amendment and free speech issues here. Because of Section 230 of the Communications Decency Act, Facebook is not legally responsible for what its users post on its site. (You can’t sue AT&T for someone who sends a murderous threat using text messages on its network; similarly, you can’t sue Facebook for hosting threats.) But Facebook is allowed to determine whatever policies it pleases regarding posts and pages on its site and app. In the same way that your landlord can evict you for having pets or loud parties, Facebook can evict you for whatever violates its policies.
This is not a First Amendment issue. No one is preventing Alex Jones from speaking or writing or publishing his own newsletter. Facebook is under no obligation to host what he says, any more than the New York Times would be responsible for refusing to publish an op-ed that he submitted.
So the real question is, “Are Facebook’s policies sound, and are they enforcing them appropriately?”
I personally find Facebook’s policies odd. For example, you can get banned for showing, horrors, a female nipple. I also think Facebook could more effectively enforce policies that prevent the spread of fake news. But Facebook has chosen to back off from banning lies, which would be very hard to do at scale, and to concentrate instead on banning threats of violence and hate speech. This is at least consistent.
In contrast to Facebook’s policies, which are transparent, its enforcement is opaque. It will not say how many complaints must happen before a page is banned. It’s not clear why InfoWars is suddenly gone, and wasn’t gone six months or a year ago. (Was it any less hateful back then?) There are plenty of other hateful pages; are they all going to be banned? Alex Jones questions, for example, why anti-Semitic statements on Louis Farrakhan’s Nation of Islam page haven’t caused Facebook to ban it.
Silicon Valley platforms have become a tool for hate. Algorithms have difficulty recognizing and banning hate; it takes humans to do that, which is inefficient and subject to accusations of bias. The benign, algorithmic machine isn’t working so well. It’s going to be a long, hate-filled struggle before the leaders of these platforms can once again establish the safety of their platforms.
This is what you say the definition of hate speech is: “Hate speech is speech that offends, threatens, or insults groups, based on race, color, religion, national origin, sexual orientation, disability, or other traits.” What about speech that denigrates women?
I could not find an ABA definition of “hate speech.” I suspect that is because the ABA has not defined the undefinable and legally void phrase. Hate speech is, of course, fully protected under the Constitution.
Facebook, of course, does not fall under the Constitution and can do what it wants. It would be nice to be fair and it would be better to be invisible, but that is their business decision. I would expect another service to come along and make FB vanish.
The only person who can make a good decision about what you want to see is you. If you think critically, you will do just fine: you’ll ignore hate speech (and lies), making it meaningless, and you’ll learn some things that don’t neatly fit into your current being. You’ll grow as you learn.
In a moment, I would sign up for a platform that was free and open.
Josh, if you grabbed the definition from this page, https://www.americanbar.org/groups/public_education/initiatives_awards/students_in_action/debate_hate.html, that is not an ABA definition. That’s a project with a definition for the project, in my opinion.