Facebook on Wednesday removed nearly 800 groups associated with the far-right conspiracy movement QAnon, and also took down over 1,500 ads and 100 pages tied to the group, in a move to curb content that could lead to “violent acts”.
In a blog post, Facebook said the action is part of a broader policy on “dangerous individuals and organizations”, under which it removes and bans content that has led to real-world violence. The policy will also affect militia groups and political protest organizations such as Antifa.
“While we will allow people to post content supporting these movements and groups, so long as they do not otherwise violate our content policies, we will limit their ability to organize on our platforms,” the company said.
QAnon supporters believe President Donald Trump is secretly thwarting a “deep state” cabal of pedophiles and satanists operating throughout Washington DC. More recently, the conspiracy theorists have latched onto the COVID-19 public health crisis, calling the virus a “bioweapon”.
QAnon theories hit the mainstream after the #Pizzagate controversy, in which a man brought a gun to a pizzeria, believing he would find victims of child abuse there. The group has also been linked to dozens of other violent incidents stemming from baseless theories shared on private Facebook groups and message boards.
Facebook took action against QAnon earlier this month when it took down an influential group with more than 200,000 members, but Wednesday’s action is perhaps the social media giant’s most significant step yet.
The company stated that it would stop QAnon content from appearing in its recommendations tab, reduce its ranking in search results, and prohibit QAnon-related accounts and groups from monetizing content, selling merchandise, fundraising, and advertising on both Facebook and Instagram. The company also plans to continue investigating how QAnon operates on its platform, including “the specific terminology and symbolism used by supporters”, to identify language used by these groups that signals violence and take action accordingly.
In recent months, other social media sites such as Twitter and TikTok have banned or disabled popular QAnon hashtags and accounts for engaging in coordinated inauthentic behavior and spreading disinformation.
However, don’t expect QAnon to disappear quietly: experts have called QAnon members “really good” at adapting to the online ecosystem, and many QAnon supporters have won primaries for public office while running on platforms that echo the conspiracy theories shared within the group.