Facebook is cracking down on COVID-19 and vaccine misinformation

Published by Sam Chandler

Facebook has had a tricky history with the spread of misinformation, and now the major social media platform is taking direct action to remove false claims about COVID-19 and vaccines. This new stance on vaccine misinformation takes effect immediately, with the company focusing on Pages, groups, and accounts.

Instagram will also see the effects of this change.

In an update posted on February 8, 2021, Facebook outlined its steps for tackling how misinformation spreads on its platform. The post links a full list of claims that will be removed and highlights four main areas of focus.

The removal of misinformation also extends to topics such as the availability of essential services, claims about “cures”, how COVID-19 is transmitted, and the severity of COVID-19.

Accounts that continue to share debunked claims surrounding vaccines and COVID-19 “may be removed altogether”, reads the post. Facebook is also altering the search function to ensure that results provide people with “relevant, authoritative results” and “expert information about vaccines.” As part of this change, accounts that discourage people from getting vaccinated will be ranked lower.

These changes follow an update from December 2020, when Facebook began notifying people who had interacted with COVID-19 misinformation. Those notifications were aimed at ensuring people knew why a post was removed and why the information was false, while also providing facts about COVID-19. The new steps Facebook is taking will, according to the post, be in effect “during the pandemic”. Whether these changes remain in place after the pandemic remains to be seen.