YouTuber and podcaster Seth Andrews recently received a 30-day suspension from Facebook for the crime of… calling out the bigotry of a Christian hate-preacher. And since I accidentally played a role in his banishment, it seems worth chiming in on what a horrible move this is on the part of Facebook (and its parent company Meta).
This stems from a tweet I made last week, announcing that the church of Jonathan Shelley, a New Independent Fundamentalist Baptist preacher, was getting evicted from its Hurst, Texas headquarters due to his anti-LGBTQ rhetoric.
Shelley, for those who are blissfully unaware, is a preacher who has previously said of gay people, “I hope they all die!” He has also argued against anti-discrimination laws, saying they only exist because “you’re gross and disgusting and no one likes you.” More damningly, he says the government ought to execute LGBTQ+ people.
No wonder he’s finally getting evicted. It’s not because of his religion. This isn’t faith-based persecution. The eviction is occurring because Shelley uses his church to incite violence and hate against LGBTQ+ people.
That’s the tweet that Seth Andrews shared on The Thinking Atheist Facebook page last week, noting that Shelley has repeatedly said “homosexuals deserve to be executed.” Perhaps because the Facebook algorithm only saw those words, absent the surrounding context, it levied a seven-day ban on Andrews. He challenged it… and one of the company’s moderators subsequently reversed it.
That’s what should always happen. An algorithm may be necessary to remove hate speech quickly, but a human must be able to step in to reverse wrongful takedowns. Especially when, in this case, Andrews is quoting the hate-preacher in an effort to criticize him. It shouldn’t be that hard for one of the wealthiest companies in the world.
And yet, as Andrews explains in the following video, even after that initial reversal, he was almost immediately issued another ban for 30 days… for the exact same post. It’s as if the algorithm had no memory of what happened the previous day.
Andrews could soon lose all access to Facebook, through no fault of his own, because of this egregiously wrong decision.
The short-term solution here is obvious: Let’s hope someone who works at Facebook notices this and fixes the problem.
But the long-term solution is more pressing: There needs to be a way to distinguish actual hate speech—the sort of thing preachers like Shelley spew multiple times a week—from criticism of that very same hate speech. No one should be punished for calling out religious bigotry, and it’s incredibly hard to call out that bigotry without directly quoting the preachers.
As it stands, Facebook’s algorithm is stifling religious criticism by confusing it with the very hate speech it’s trying to eradicate. (While the site itself has plenty of problems, as Andrews notes in the video, it’s also an incredibly useful tool with which to communicate and spread ideas. So simply not using it, as some have suggested, really isn’t a helpful solution here).
There are always going to be gray areas when it comes to moderation on social media. We’ve been having that very discussion about our OnlySky comment threads! But this is not a gray area.
Andrews used Facebook to draw attention to bad behavior from someone acting in the name of Jesus. And Facebook punished him for it.
It didn’t happen on purpose, I’m sure, but unless the company fixes the problem, its algorithms give too much leverage to the people spreading hate speech and not enough to those calling it out and condemning it.
Whose side is Facebook on?