Facebook is taking a giant step in keeping dangerous QAnon content off its platforms.
On Tuesday, Facebook announced it’s officially banning all Pages, Groups, and Instagram accounts representing QAnon. The policy update is being described as “one of the broadest rules the social media giant has put in place in its history.”
Facebook has previously taken action against QAnon. Back in May, it removed a network of Pages, Groups, and accounts that pushed the conspiracy. However, the company said it was removing them because they involved fake accounts and engagement, which are against its rules — not because they were spreading dangerous content.
Facebook removed one of the QAnon-related groups on its platform months later, in August, under its policies banning misinformation, harassment, and hate speech. Just days after that action, an internal Facebook investigation came to light, laying bare just how bad the platform’s QAnon problem was: millions of the site’s users were joining groups supporting the conspiracy theory.
On Aug. 19, Facebook announced it was cracking down on QAnon pages and groups that discussed or promoted “potential violence.” However, Facebook stopped short of totally banning QAnon. The company said at the time that the conspiracy theory didn’t meet the “rigorous criteria” of its policies.
Fast-forward to today: Facebook announced an update to its policies and declared a sweeping ban on QAnon.
“While we’ve removed QAnon content that celebrates and supports violence, we’ve seen other QAnon content tied to different forms of real-world harm, including recent claims that the west coast wildfires were started by certain groups, which diverted the attention of local officials from fighting the fires and protecting the public,” Facebook said in a statement announcing the policy change.
However, will this broad ban on all QAnon content be enough? Can Facebook keep up with the ever-evolving conspiracy theory?
“Q has specifically asked QAnon followers to ‘deploy camouflage’ by dropping all references to ‘Q’ and ‘QAnon,’” Travis View, co-host of a popular podcast that tracks the conspiracy theory from a critical perspective, said in a direct message on Twitter. “Instead QAnon followers have been replacing Q with ’17,’ ‘Cue Anon,’ or ‘Save The Children.’”
Three years ago yesterday, President Donald Trump proclaimed to the press that it was “the calm before the storm” during a White House dinner with top U.S. military officials. This line laid the foundation for what would later be known as QAnon.
A few weeks after that dinner, a user known as “Q” began posting on the imageboard 4chan, claiming to be a government official with top security clearance. This anonymous entity’s posts have led followers of the conspiracy to believe that Trump has been waging a secret war against a global satanic pedophile ring run by a cabal of Hollywood elites and members of the Democratic Party.
What individual QAnon followers believe is all over the place. Some QAnon believers think that John F. Kennedy Jr. is still alive and will replace Mike Pence as vice president on the Republican ticket any day now. Others think some alleged celebrity suspects in the cabal have already been executed, with clones now walking in their place.
The QAnon conspiracy has evolved further during the pandemic lockdowns. Coronavirus deniers, anti-maskers, anti-vaxxers — all of these groups have folded into the broader QAnon conspiracy theory in one way or another. As View said, the QAnon believers focused on Trump’s political enemies being involved in child sex trafficking have been especially visible, holding real-world events under the rallying cry “Save Our Children.”
“If Facebook is seeking guidance from knowledgeable online extremism researchers, and I assume that they have, then they should be able to quickly detect the most common attempts to disguise affiliation with QAnon,” said View.
Facebook seems to understand this aspect of QAnon, at least based on what the company wrote in its statement.
“QAnon messaging changes very quickly and we see networks of supporters build an audience with one message and then quickly pivot to another,” said the company. “We aim to combat this more effectively with this update that strengthens and expands our enforcement against the conspiracy theory movement.”
The company pointed to “Save Our Children” as an example, without explicitly naming it. Facebook says it “began directing people to credible child safety resources when they search for certain child safety hashtags last week,” knowing that QAnon believers use the issue to recruit.
Keeping QAnon off an entire social media platform has actually been done before. However, unlike with other sites, QAnon believers undoubtedly know how key Facebook, specifically, has been to the conspiracy theory’s spread. They’ll surely try to find a workaround. Facebook’s enforcement will be key in keeping the conspiracy theory off its platform.