On Monday, Facebook announced that it was taking more aggressive steps to combat conspiracy theories and misinformation about vaccines. In a statement detailing its steps to promote authoritative information about COVID-19 vaccines, the social media giant shared an expanded list of false and dangerous claims about vaccines that will not be allowed on the platform.
Facebook initially announced in December that it would remove false claims about the coronavirus vaccines on both Instagram and Facebook — a move made right as the first vaccines were being approved for use. The initial list of false information the platform said it would remove included "false claims that COVID-19 vaccines contain microchips, or anything else that isn't on the official vaccine ingredient list," claims that COVID-19 didn't exist or was no worse than the flu, and claims that COVID-19 was caused by or linked to 5G communications technology.
The expanded list of false information that will be removed now includes claims that COVID-19 is man-made, that vaccines are not effective against the diseases they are meant to prevent, that vaccines cause autism and that it is safer to get COVID-19 than receive the vaccine.
Facebook said these new standards will go into effect “immediately,” adding that accounts, pages and groups that “repeatedly share these debunked claims” on Facebook and Instagram may be removed.
Facebook launched its first major campaign against misinformation in 2016 after U.S. intelligence agencies determined Russia had meddled in the U.S. elections. The company formed an alliance with fact-checking organizations, but it had been reluctant to outright ban or remove false or misleading posts. Instead, for years, it merely added a pop-up dialog window warning of incorrect or unproven information. Users were still able to read the posts, comment and click on included links.
Facebook continued adding these warnings to posts containing misinformation throughout the 2020 election, including flagging numerous posts from former President Trump, who repeatedly spread unsubstantiated claims about voter fraud. Following the deadly Capitol riot on January 6, Facebook locked Mr. Trump's account and banned him from the platform a day later.
Prior to Mr. Trump’s ban, and after facing pressure for years, the platform also attempted to crack down on hate speech by blocking the likes of Alex Jones, Louis Farrakhan and Milo Yiannopoulos.
When it came to anti-vaccination content, however, Facebook's earlier pledge to battle misinformation focused more on reducing the visibility of hoaxes and misinformation in News Feeds and searches than on removing the content altogether.
The anti-vaccine movement has grown rapidly in recent years. Myths about the potential harm of vaccinating children persist, despite multiple studies showing vaccines do not cause autism. Some experts have blamed social media platforms' unwillingness to regulate vaccine misinformation for a 2015 outbreak of measles, a disease that had been declared eliminated in the U.S. in 2000.