Meta Rolls Back Covid Misinformation Rules

Meta CEO Mark Zuckerberg, seen testifying before the House in 2018, was hauled before Congress again in 2021 to address his company’s handling of coronavirus misinformation. (Washington Post photo by Matt McClain)

Meta is rolling back its policies against covid misinformation globally, ending the rules in countries like the United States, where the pandemic is no longer considered a national emergency, while keeping them in place where officials say the threat remains high.

The tech giant’s oversight board in April recommended that it “maintain” its rules against misleading coronavirus content until global health authorities removed the pandemic’s emergency status, a step the World Health Organization took two weeks later in May.

The board criticized the Facebook and Instagram parent company at the time over its “insistence” on taking “a single, global approach” to the issue rather than developing a “localized” policy that took into account the pandemic’s progression on a regional basis.

The group, made up of legal experts and advocates, wrote in an advisory opinion that “by ruling out this option, Meta has frustrated the Board’s efforts to reconcile competing viewpoints from stakeholders and Board Members on how to best address harmful COVID-19 misinformation.”

The company’s new policy shift, which will take effect in the coming weeks, brings its position more in line with the board’s recommendation.

“In countries that have a Covid-19 public health emergency declaration, we will continue to remove content for violating our Covid-19 misinformation policies given the risk of imminent physical harm,” Meta said in a post Friday shared first with The Washington Post.

It added, “Our Covid-19 misinformation rules will no longer be in effect globally as the global public health emergency declaration that triggered those rules has been lifted.”

The company said it will consult with health experts to determine which specific claims could still face removal in countries where emergency declarations are in place. Meta first asked for its oversight board’s opinion on the issue in July 2022.

The social media giant’s rules on medical misinformation sparked major political backlash, with Democrats accusing it of failing to stamp out harmful claims about the virus and Republicans slamming it for stifling views about the pandemic’s risks.

House lawmakers hauled in CEO Mark Zuckerberg to testify on the topic alongside other top tech executives in March 2021, hammering the companies over their response to the pandemic.

Criticism of Meta’s approach reached a fever pitch in May 2021 after the company said it would no longer remove claims that covid-19 was manufactured or man-made amid renewed debate about the origins of the virus, as The Post first reported.

Two months later, President Biden said social media platforms like Facebook were “killing people” by not policing more forcefully against misinformation about vaccines.

The company, since renamed Meta, said at the time that Biden’s claim was not “supported by the facts.” Biden later walked back the remarks, laying the blame on misinformation spreaders.

Twitter disclosed in November 2022 that it would no longer enforce its covid misinformation rules, taking the step shortly after Elon Musk took over the platform and months before the United States or WHO ended their health emergency declarations.

Instagram earlier this month reinstated the account of Robert F. Kennedy Jr., a prominent vaccine skeptic and the nephew of the late president John F. Kennedy, after he announced he is running for president as a long-shot Democratic challenger to Biden.

The company removed his account in 2021 for “repeatedly sharing debunked claims about the coronavirus or vaccines.”

YouTube’s policies as of Thursday stated that the platform still prohibits “content that denies the existence of the coronavirus or encourages the use of home remedies in place of medical treatment” or that “explicitly disputes the efficacy of global or local health authority advice regarding social distancing that may lead people to act against that guidance.”

“YouTube’s policies on COVID-19 are subject to change in response to changes to global or local health authorities’ guidance on the virus,” the Google-owned company’s policies stated.