Don’t Cry for Alex Jones

Alex Jones does not represent some sort of slippery slope for speech on social media. He represents, rather, a wasteland of repugnant ideas, judged not for their political content but for their lack of decency and sanity. It’s a free country: His cheeks are free to turn to molten lava until they are cooled by his own sweat. But Facebook is under no obligation to share any footage of it. Neither is YouTube. Neither is any other private platform that wishes to sequester his theatrics to the undistinguished netherworld of the web where they belong. Facebook, YouTube, and Apple are not the Internet; Jones may still post to his own website, which no authority is trying to take away.

Defend his right to rage, absolutely, up to the point where it becomes criminally defamatory. But to worry that his expulsion from a content distributor may lead to yours says more about your association with him than it does about social media. Alex Jones, unmistakably, is that extreme: in a class of his own. It is up to the right, in what it chooses to accept, to keep it that way.

Still, some on the right are concerned that these websites could regulate their users capriciously. Senator Ted Cruz expressed such worry in a strange way. “As the poem goes, you know, first they came for Alex Jones. That does not end well,” he said, per the Texas Observer.

National Review’s David French, writing in the New York Times, was thoughtful, by contrast. “Rather than applying objective standards that resonate with American law and American traditions of respect for free speech and the marketplace of ideas, the companies applied subjective standards that are subject to considerable abuse,” argues French. “Apple said it ‘does not tolerate hate speech.’ Facebook accused Mr. Jones of violating policies against ‘glorifying violence’ or using ‘dehumanizing language to describe people who are transgender, Muslims and immigrants.’ YouTube accused Mr. Jones of violating policies against ‘hate speech and harassment.’”

French’s fear is that these phrases are “extraordinarily vague,” and could be applied just as readily to punish Jones as to punish British-Muslim reformer Maajid Nawaz, whom the Southern Poverty Law Center wrongly labeled an “anti-Muslim extremist.” The comparison between how the SPLC applies such terminology and how a social media network does is not exact. But the broader point stands: private entities can’t necessarily be trusted to apply it fairly.

Facebook, for example, which “unpublished” four Jones/Infowars-related pages, has a set of “Community Standards” it uses to regulate users. The four pages violated the standards’ “hate speech and bullying policies,” as well as “glorif[ied] violence, which violates our graphic violence policy, and [used] dehumanizing language to describe people who are transgender, Muslims and immigrants, which violates our hate speech policies,” the company wrote in a blog post on Monday. Facebook defines “hate speech” as “a direct attack on people based on what we call protected characteristics — race, ethnicity, national origin, religious affiliation, sexual orientation, caste, sex, gender, gender identity, and serious disease or disability,” as well as immigration status. It separates hate-speech violations into three tiers, which appear to serve organizational purposes as much as to denote severity.

Tier 1 includes: “Any violent speech or support in written or visual form … Dehumanizing speech or imagery including (but not limited to): Reference or comparison to filth, bacteria, disease, or feces; Reference or comparison to animals that are culturally perceived as intellectually or physically inferior; Reference or comparison to subhumanity … Mocking the concept, events or victims of hate crimes even if no real person is depicted in an image … Designated dehumanizing comparisons in both written and visual form.”

Some of these are more specific than others: Referring to people as something approximating untermensch seems vile and clear, whereas calling someone a POS can be, in context, sarcastic or part of a testosterone-fueled debate about the Sunday Night Football game. In situations like this, it is on users to report offenses in good faith, and on Facebook to exercise its discretion in the same good faith. The question for each Facebooker, then, is: Do you trust the process?

It is a process that, apropos French, is somewhat vague. “We don’t want people to game the system, so we do not share the specific number of strikes that leads to a temporary block or permanent suspension,” the Monday blog post from Facebook reads.

“If someone violates our policies multiple times, their account will be temporarily blocked; a Page that does so will be unpublished.”

Given that Facebook does not disclose a threshold of strikes for banishment, I asked if the company could be more specific about its evaluation of the Jones/Infowars-related pages, and whether it had any sort of bright-line or Potter Stewart-like test for disciplinary action. Jones, after all, is such a serial offender and high-profile figure that his removal from the platform would seem both obvious and arbitrary whenever the judgment was rendered.

A Facebook spokesperson merely referred me back to the blog post and the Community Standards guide I had already read and asked about.

French, whose recommendation is that these tech companies move on from uncertain standards and instead “prohibit libel or slander on their platforms,” seems to have a point. But even an open-ended reading of policies like Facebook’s provides clarity about permissible behavior on these websites: Don’t be like Alex Jones, and you won’t be punished as he was.
