On the face of it, you might think that the QAnon conspiracy has largely disappeared from big social media sites. But that’s not quite the case.
True, you’re much less likely to find popular QAnon catchphrases like ‘great awakening,’ ‘the storm’ or ‘trust the plan’ on Facebook these days. Facebook and Twitter have removed tens of thousands of accounts dedicated to the baseless conspiracy theory, which depicts former President Donald Trump as a hero fighting a secret battle against a sect of devil-worshipping pedophiles who dominate Hollywood, big business, the media and government.
Gone are the huge ‘Stop the Steal’ groups that spread falsehoods about the 2020 US presidential elections. Trump is gone as well, banned from Twitter permanently and suspended from posting on Facebook until 2023.
But QAnon is far from winding down. Federal intelligence officials recently warned that its adherents could commit more violence, like the deadly Capitol insurrection on 6 January. At least one open supporter of QAnon has been elected to Congress. In the four years since someone calling themselves ‘Q’ started posting enigmatic messages on fringe internet discussion boards, QAnon has grown up.
That’s partly because QAnon now encompasses a variety of conspiracy theories, from evangelical or religious angles to alleged pedophilia in Hollywood and the Jeffrey Epstein scandal, said Jared Holt, a resident fellow at the Atlantic Council’s DFRLab who focuses on domestic extremism. “Q-specific stuff is sort of dwindling,” he said. But the worldviews and conspiracy theories that QAnon absorbed are still with us.
Loosely tying these movements together is a general distrust of a powerful, often leftist elite. Among them are purveyors of anti-vaccine falsehoods, adherents of Trump’s ‘Big Lie’ that the 2020 presidential election was stolen and believers in just about any other worldview convinced that a shadowy cabal secretly controls things.
For social platforms, dealing with this faceless, shifting and increasingly popular mindset is a far more complicated challenge than any they’ve dealt with in the past.
These ideologies “have cemented their place and now are a part of American folklore,” said Max Rizzuto, another researcher at DFRLab. “I don’t think we’ll ever see it disappear.”
Online, such groups now blend into the background. Where Facebook groups once openly referenced QAnon, you’ll now see others like “Since you missed this in the so called MSM,” a page referencing “mainstream media” that boasts more than 4,000 followers. It features links to clips of Fox News’ Tucker Carlson and to articles from right-wing publications such as Newsmax and the Daily Wire.
Subjects range from allegedly rampant crime to unfounded claims of widespread election fraud and an “outright war on conservatives.” Such groups aim to draw followers in deeper by directing them to further information on less-regulated sites such as Gab or Parler.
When DFRLab analyzed more than 40 million appearances of QAnon catchphrases and related terms on social media this spring, it found that their presence on mainstream platforms had declined significantly in recent months. After peaks in the late summer of 2020 and briefly on 6 January, QAnon catchphrases have largely evaporated from mainstream sites, DFRLab found.
So while your friends and relatives might not be posting wild conspiracies about Hillary Clinton drinking children’s blood, they might instead be repeating debunked claims, such as the notion that vaccines can alter your DNA.
There are several reasons for dwindling Q talk — Trump losing the presidential election, for instance, and the lack of new messages from ‘Q.’ But the single biggest factor appears to have been the QAnon crackdown on Facebook and Twitter. Despite well-documented mistakes that revealed spotty enforcement, the banishment largely appears to have worked. It is more difficult to come across blatant QAnon accounts on mainstream social media sites these days, at least from the publicly available data that does not include, for instance, hidden Facebook groups and private messages.
While QAnon groups, pages and core accounts may be gone, many of their supporters remain on the big platforms — only now they’re camouflaging their language and watering down the most extreme tenets of QAnon to make them more palatable.
“There was a very, very explicit effort within the QAnon community to camouflage their language,” said Angelo Carusone, the president and CEO of Media Matters, a liberal research group that has followed QAnon’s rise. “So they stopped using a lot of the codes, the triggers, the keywords that were eliciting the kinds of enforcement actions against them.”
Other dodges may have also helped. Rather than parroting Q slogans, for instance, supporters for a while earlier this year would type three asterisks next to their names to signal adherence to the conspiracy theory. (That’s a nod to former Trump national security adviser Michael Flynn, a three-star general.)
Facebook says it has removed about 3,300 pages, 10,500 groups, 510 events, 18,300 Facebook profiles and 27,300 Instagram accounts for violating its policy against QAnon. “We continue to consult with experts and improve our enforcement in response to how harm evolves, including by recidivist groups,” the company said in a statement.
But the social giant will still cut slack to individuals posting about QAnon, citing experts who warn that banning individual Q adherents “may lead to further social isolation and danger,” the company said. Facebook’s policies and response to QAnon continue to evolve. Since last August, the company says it has added dozens of new terms as the movement and its language have evolved.
Twitter, meanwhile, says it has consistently taken action against activity that could lead to offline harm. After the 6 January insurrection, the company began permanently suspending thousands of accounts that it said were “primarily dedicated” to sharing dangerous QAnon material. Twitter said it has suspended 150,000 such accounts to date. Like Facebook, the company says its response is also evolving.
But the crackdown may have come too late. Carusone, for instance, noted that Facebook banned QAnon groups tied to violence six weeks before it banned QAnon more broadly. That effectively gave followers notice to regroup, camouflage and move to different platforms.
“If there were ever a time for a social media company to take a stand on QAnon content, it would have been like months ago, years ago,” DFRLab’s Rizzuto said.