Facebook is a perfect place for conspiracy theories like QAnon to evolve
Last week, a notorious conspiracy theory left the dark confines of the internet and entered the real world, appearing as T-shirts and signs in front of TV cameras at a Donald Trump rally. “What is QAnon?” multiple headlines blared.
Those headlines, neatly compiled into a shareable image, quickly made their way into Facebook groups dedicated to the conspiracy theory—as did links to the articles. Commenters were glad to get mainstream media coverage.
Facebook is only one of the platforms where the conspiracy has been spreading. But it’s a particularly powerful one, bringing far-fetched musings into the mainstream. “Facebook is one of those last ports of call before you start seeing things on television,” said Benjamin T. Decker, research fellow at the Shorenstein Center on Media, Politics and Public Policy at the Harvard Kennedy School.
What is QAnon?
The basic premise of the QAnon theory is that there’s an anonymous person who has a high-level government security clearance, designated by the letter “Q,” hence the nickname. Q drops “bread crumbs” of information to followers about a vast conspiracy against Donald Trump. In a way, it’s the mother of all conspiracy theories: for example, according to the hints, Democrats are running a child molestation ring, they’re in bed with the Saudi royal family, and they’re also satanists.
One of the reasons why the QAnon theory is different, and so popular, is that it has a “fun” element, said Joseph Uscinski, associate professor of political science at the University of Miami and expert on conspiracy theories. “You have somebody putting out these breadcrumbs and clues—it gives people something to do.” The number 17, for example, is central to the theory, as “Q” is the 17th letter of the alphabet. Trump has mentioned the number in offhand remarks and tweets in the past, a fact that followers of the theory have not failed to notice.
As followers of Q start making themselves visible in the real world, concerns about potential violence become more tangible. “Eventually somebody is going to fight fire with fire,” Uscinski said. “If you think there are groups in secret molesting children, somebody is going to take action against them,” he added, referencing the Pizzagate conspiracy theory, which led one man to fire an assault rifle in a Washington DC pizza shop because he was wrongly convinced it was a front for a pedophile ring.
Tricking Facebook users into following a conspiracy theory
Although the exact patterns of how the conspiracy spread are not yet clear, it appears that it started on the message boards 4chan and 8chan. From there, it spread to Reddit and YouTube, finally arriving—with help from a popular mobile app called QDrops—on social media platforms like Facebook. Currently, the most-followed Reddit community dedicated to the theory has nearly 50,000 members, and the largest Facebook group has more than 40,000. More than 10,000 joined the group just in the past month.
But subscribing to the theory on Facebook is notably different from doing so on the less popular sites. The real-name policy enforced by the platform means that users are openly admitting to being Q followers, since you can’t hide behind an anonymous username. And the platform is far bigger than Reddit or 4chan—two-thirds of US adults use Facebook.
Facebook is also diverse in terms of demographics, attracting many older users who can sometimes be intentionally tricked into joining a QAnon group. “These people who haven’t necessarily been raised as digital natives are naively sort of navigating their way through Facebook which is like a giant library with no Dewey Decimal System,” Decker said.
To gain followers, conspiratorial or hyperpartisan actors commonly establish a Facebook group under one guise, and then, after gaining a following, change its character to align with their needs. The largest QAnon group on Facebook was formed six years ago, long before the conspiracy’s seeds were sown.
The QAnon conversation evolves on Facebook
On Facebook, the QAnon conversation is also transformed. It’s not necessarily a deep-dive discussion of Q’s latest hint. John F. Kennedy’s assassination comes up; so do videos of Trump’s latest speech.
“One of the biggest things as you get farther into these open social media spaces is that the range of topics that are also discussed and then in turn linked to the larger QAnon conspiracy is significant,” Decker said. People share links on various topics, like local politics or primaries, and this gets mixed in with the “deep state” conspiracy. Everything becomes connected to Q.
As the conspiracy went from the fringes of the internet to more mainstream spaces, it started intersecting with the digital community backing Republicans in the upcoming midterm elections. “It’s hard not to see that the open social propagation is intent on influencing voters who may, after the first eighteen months of the presidency, be disillusioned or frustrated with anything the president has not been able to do, or that he has chosen not to do,” Decker said.
“This conspiracy lends itself to support by sort of downtrodden, middle-aged people who are looking for something, who are looking for some scapegoat for social issues that remain unaddressed.”
Uscinski cautions, however, that the conspiracy’s presence on Facebook doesn’t mean that millions of people could get converted. The filter bubbles that have made Facebook into such a hyperpolarized space also counteract the QAnon expansion: “It spreads to people who are already disposed. We choose our friends, we choose our feeds, we choose the things we like and dislike, we choose the groups we join.”
As a rule, Facebook does not ban conspiracy theories, or misinformation, from its website. One QAnon group was taken down by the site, which a Facebook spokesperson said was because of violations of its bullying policies. But many more groups, boasting thousands of members, continue to flourish.
As the social platforms started publishing their content moderation policies in detail in recent months, Decker said, the groups learned how to police themselves in order to avoid being banned. For example, one of the largest QAnon groups on the platform started calling Congresswoman Maxine Waters, who is African-American, a “nagger,” changing one vowel in the racial slur to evade Facebook’s rules.
“They are all very aware of what the platforms are doing,” said Decker. “And I think they have a pretty good idea of what the platforms are not capable of yet.”