Facebook Is Deleting Posts That Make False COVID-19 Claims
Users spreading vaccine misinformation could get banned.
Facebook is pledging to take a harder line against vaccine misinformation. The platform announced on Feb. 8 that it would immediately begin banning users, groups and pages that have been repeatedly spreading false claims about the coronavirus and COVID-19 vaccines. The ban extends to Instagram, which Facebook owns.
In a newsroom post on Feb. 8, the social media giant announced that it is updating its community standards to reflect its renewed focus on preventing COVID-19 myths from spreading on its platform. The company has added four new categories to an already extensive list of false posts that can get users banned for spreading misinformation about the vaccine or the virus, including those that claim:
“COVID-19 is man-made or manufactured.”
“Vaccines are not effective at preventing the disease they are meant to protect against.”
“It’s safer to get the disease than to get the vaccine.”
“Vaccines are toxic, dangerous or cause autism.”
“This is based on guidance from public health organizations that pervasive misinformation about COVID-19, and vaccines more broadly, is contributing to COVID-19 vaccine hesitancy that could have immediate and long-term physical health harms for people around the world,” a Facebook spokesperson told the Huffington Post.
“We will begin enforcing this policy immediately, with a particular focus on Pages, groups and accounts that violate these rules, and we’ll continue to expand our enforcement over the coming weeks,” Guy Rosen, Facebook’s vice president of integrity, wrote in the post. “Groups, Pages and accounts on Facebook and Instagram that repeatedly share these debunked claims may be removed altogether.”
Other claims about COVID-19 and vaccines that don’t specifically violate the policies could still be reviewed by third-party fact-checkers and be labeled and demoted, he added.
On Instagram, Rosen wrote, “In addition to surfacing authoritative results in Search, in the coming weeks we’re making it harder to find accounts in search that discourage people from getting vaccinated.”
As of November, Facebook had removed 12 million posts containing false COVID-19 claims, but as misinformation about the vaccine continued to spread, the platform decided to tighten its posting guidelines even further.
Still, experts have expressed concern that Instagram is quickly becoming a superspreader of fear-mongering false claims about the COVID-19 vaccine.
“A lot of the accounts that were removed from the Facebook platform remain active on Instagram, with enormous follower counts,” Anna-Sophia Harling, the head of Europe for NewsGuard, told The Guardian in January. “Instagram has a huge Covid-19 vaccine problem.”