As the 2020 election heats up, social media sites and tech companies have had to shut down disinformation campaigns designed to influence the vote. But this year in particular, tech giants have a new, and quickly spreading, concern: the QAnon conspiracy theory.
What is QAnon?
QAnon is a baseless, far-reaching conspiracy theory that originated with an anonymous figure known as “Q” on 4chan’s message boards in October 2017.
The anonymous figure claims to be a high-ranking government official within the Trump administration with access to classified intelligence, and intermittently posts cryptic clues, or “drops,” for supporters to decode. Q’s drops later moved from 4chan to 8chan, and after 8chan was taken offline in 2019, to its successor site, 8kun.
Proponents of QAnon believe that President Trump is secretly fighting a cabal of liberal Washington, D.C., politicians and Hollywood elites engaged in Satanism, sex trafficking, and pedophilia.
None of Q’s predictions have come true, but supporters are still sticking to the theory, and have proved quick to latch onto trending online topics to spread it. QAnon supporters have also given new life to old, recycled conspiracies like #Pizzagate, which falsely accused a Washington, D.C., pizzeria of harboring a sex-trafficking ring during the 2016 presidential election.
The coronavirus pandemic is also a talking point for the group, which dismisses COVID-19 as a hoax. Recently, QAnon supporters have co-opted the #SaveTheChildren hashtag to promote the false belief that “elites” use children’s blood to extend their lives.
How has QAnon spread?
It is unknown how many people seriously believe in the QAnon conspiracies. Experts who study conspiracy theories say QAnon’s following is broader than it is deep: a core of ideological devotees is heavily invested in the theories, while most Americans either have no idea what the conspiracy theory claims as a whole or have heard only parts of it.
However, thanks to social media platforms such as YouTube, Facebook, Twitter, Instagram and TikTok, QAnon is no longer a fringe group; it has gone mainstream. Posts and accounts surrounding the QAnon conspiracy theory have garnered thousands of clicks and engagements from users, often inadvertently promoted by the sites’ algorithms.
Supporters have used popular social media platforms to coordinate troll-like behavior and attacks, promote their hashtags, funnel users into private groups, and recruit new followers, taking their rhetoric out of the shadows of the dark web.
What are Facebook, Twitter and others doing against QAnon?
Tech giants have only recently begun to clamp down on QAnon accounts, groups and advertisements, after the online conspiracies were followed by real-world violence.
Twitter, the first to crack down on the group, banned thousands of QAnon accounts and stopped its popular keywords and slogans from trending, for example “#WWG1WGA,” shorthand for “Where we go one, we go all.”
TikTok disabled popular QAnon-related hashtags as teenagers on the platform began to take notice.
Facebook recently said that it would not allow QAnon groups to buy advertisements and would take down hundreds of accounts spreading the conspiracy theory on the site. And YouTube announced it would down-rank QAnon content to keep it off its home page, although Digital Trends recently found that QAnon content still appears in users’ feeds.
Despite the efforts of social media platforms, QAnon supporters are agile and coordinated, and have proved adept at dodging Big Tech’s attempts to limit the spread of the conspiracy theory. QAnon content can still be found quite easily if users know what to search for.
Despite increased content moderation, social media companies have been unable to completely scrub QAnon content from their platforms, and cannot remove posts that do not violate their own policies.