Other Silicon Valley companies, including Facebook and YouTube, said they were also considering crackdowns on the movement, whose followers include some of the president’s most extreme supporters.
The social media company late Tuesday said it banned 7,000 QAnon accounts for violating its policies, including by organizing harassment of other Twitter users or by trying to get around an earlier suspension. It has also limited the reach of QAnon-related accounts by no longer surfacing them as recommendations, no longer highlighting them in search and blocking QAnon-related URLs from being shared. The action against QAnon, which will affect about 150,000 accounts, means Twitter will de-emphasize the group as a whole as the company works to cut off the rampant spread of conspiracy theories on its site.
“We’ve been clear that we will take strong enforcement action on behavior that has the potential to lead to offline harm,” the company’s safety team wrote in a tweet. “In line with this approach, this week we are taking further action on so-called ‘QAnon’ activity across the service.” News of Twitter’s crackdown was earlier reported by NBC News.
Twitter, along with other social media sites, has become a breeding ground for conspiracy theories. Supporters spread misinformation, coordinate harassment against public figures, and organize real-life protests. Their activity reached a fever pitch during the first months of the coronavirus pandemic when some protests calling for businesses to reopen were tied to members of darker Internet subcultures, including QAnon believers.
A flood of misinformation about the novel coronavirus pushed Twitter, Facebook and YouTube to institute new policies and refer people searching for information to the Centers for Disease Control and Prevention. But they’ve struggled to keep up with groups like QAnon. In May, for example, QAnon-focused groups on Facebook were influential in spreading the “Plandemic” documentary, which falsely claimed billionaires intentionally helped spread the coronavirus, according to social media researcher Erin Gallagher. The video was viewed by millions of people before the companies removed it.
QAnon accounts have become even more prominent in spreading misinformation on Twitter during the pandemic, University of Washington professor Kate Starbird said of the crackdown.
The impact of Twitter’s decision will likely be mixed, she said. On one hand, it might prompt the conspiracy theory’s followers to rally together and cry censorship. But it will also make the harmful and misleading content harder to find, making it tougher to recruit new members.
“Removing some of the inorganic activity and just dampening, limiting the visibility of their activity, can perhaps make a dent in some of the misinformation flows we’ve been seeing,” Starbird said.
Some of the QAnon accounts are big influencers with hundreds of thousands of followers, she said. Many more are everyday users who may follow a few thousand accounts but have only a dozen or a couple hundred followers themselves.
Twitter’s move will also serve as an example for other social media companies considering taking their own action against the group, she said.
A person familiar with Facebook’s thinking, who requested anonymity because the plan is not yet public, confirmed reporting in the New York Times that Facebook also plans to limit the reach of QAnon-related posts.
YouTube spokesperson Farshad Shadloo said the video streaming site is also working to reduce the spread of QAnon videos as part of its push not to amplify what it calls “borderline” content. That policy allows videos to remain online while not being recommended widely.
The baseless QAnon theory is rooted in the belief that President Trump is secretly working to root out a network of sex predators who hold positions of power in politics and Hollywood. Followers look for posts pushing the theory from an anonymous figure known as Q. The theory has ties to Pizzagate, the 2016 conspiracy theory that falsely linked Hillary Clinton to a group of sex offenders and a Washington pizza shop.
QAnon groups have been highly active in supporting President Trump and in pushing ideas in his direction. On Mother’s Day weekend this year, the president retweeted accounts that promoted the QAnon conspiracy theory about Democratic involvement in a pedophilia cult. QAnon accounts were also major promoters of the anti-malaria drug hydroxychloroquine, which was touted repeatedly by the president as a treatment for the coronavirus. The drug has been shown to be no better than a placebo in patients who were not hospitalized.
Twitter is not the first platform to take action against QAnon. In 2018, Reddit banned /r/GreatAwakening, its largest community affiliated with the group, after saying that the group broke its rules against promoting violence.
For years, however, tech platforms have enabled the growth of QAnon because social media companies have generally not banned misinformation outright. Only this year did the companies begin prohibiting misinformation related to the coronavirus and voting.
Online accounts associated with the QAnon conspiracy theory have played a significant role in spreading misinformation about major public events, including the pandemic. They were also affiliated with the ReOpen movement that encouraged Americans to protest the lockdowns.
Twitter has taken a leading position in policing tweets that violate its policies in the lead-up to the election, including by slapping warning labels on five of Trump’s tweets. Facebook said last month it would follow suit, to a certain extent, and label some posts from public officials that violated its policies but that it deemed newsworthy enough to keep online.
QAnon’s hold in the offline world, bolstered by the reopening protests, has moved into the political realm. Nearly 600,000 people have voted for candidates who have at one point shown support for the conspiracy theory, according to an analysis of Media Matters data.