Meta Embraces Fact-Checking Program That X Users Say Is Like ‘Whack-a-Mole’

Meta CEO Mark Zuckerberg makes a keynote speech at the Meta Connect annual event at the company’s headquarters in Menlo Park, California, U.S., September 25, 2024. (REUTERS/Manuel Orbegozo)

Before the November election, Walt Wang said he spent 10 to 20 hours a week debunking falsehoods on X, the social media platform owned by Elon Musk. As election-related lies and propaganda spread throughout the site, he carefully crafted responses, backed by reputable sources, to counter the false claims before they reached millions of users.

But often, he said, his efforts felt “similar to a game of whack-a-mole.” He spent hours debunking one conspiracy theory, only to watch another crop up moments later.

Wang is part of a legion of X users who volunteer their time policing misinformation on the site through a feature called Community Notes, which Meta said Tuesday it will adopt in place of its extensive fact-checking program in the United States.

While dedicated contributors like Wang praise the program for giving moderation power to a diverse set of users, they warn that it’s an inadequate replacement for professional fact-checking – and fear it will be even harder for everyday people to combat falsehoods on Meta’s array of global platforms, including Facebook, Instagram and Threads.

“An entirely voluntary group of people doesn’t have the manpower or resources to debunk all of the conspiracy theories,” Wang, a 27-year-old who works in health care, said over X direct message. “Inevitably, an increasing number of conspiracy theories will slip through.”

In moving to a crowdsourced approach, Meta is shifting the onus of containing falsehoods on some of the world’s largest social networks to ordinary users like Wang, who will do the work without pay or training. The company is following a path blazed by Musk into a new era where social media giants prioritize freewheeling discourse over concern for the harms that can arise when incendiary falsehoods or conspiracy theories go viral – a shift Meta CEO Mark Zuckerberg characterized on Tuesday as getting “back to our roots.”

Zuckerberg himself acknowledged the move to a user-based program is a “trade-off” that will inevitably allow more “bad stuff” to circulate on Meta’s platforms.

And while researchers have hailed Community Notes as a novel idea in content moderation, online misinformation experts have long argued that volunteers are not an adequate replacement for trained employees. The policy, many researchers say, will allow more falsehoods to circulate online and seep into the real world, a shift with unknown consequences for society.

“Community Notes can bring different viewpoints,” said Shannon McGregor, an associate professor at the University of North Carolina at Chapel Hill who studies social media platforms. “But viewpoints are different than a skill set of adjudicating the veracity of information.”

On X, any user can request to join Community Notes. Once accepted, contributors can propose a note on any post they argue is incorrect or needs more context. Notes that receive enough backing from contributors with “different perspectives” are displayed publicly, according to the company, a system meant to ensure agreement across ideological lines rather than simple majority vote.
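X has published the actual ranking system, which is considerably more involved (it uses matrix factorization over the full rater-note matrix). As a rough illustration of the core “different perspectives” idea only, the toy Python sketch below shows a note only when its helpful ratings come from raters whose estimated viewpoints span a wide range; every name, number, and threshold here is invented for illustration, not taken from X’s implementation.

```python
# Toy sketch of a "bridging" visibility rule, NOT X's actual algorithm.
# Assumption: each rater has an estimated viewpoint score in [-1.0, 1.0],
# where the two ends represent opposing ideological leanings.
from dataclasses import dataclass

@dataclass
class Rating:
    rater_viewpoint: float  # estimated leaning of the rater, -1.0 to 1.0
    helpful: bool           # did this rater mark the note "helpful"?

def note_is_shown(ratings, min_helpful=3, min_spread=0.5):
    """Display a note only if enough 'helpful' ratings exist AND those
    ratings come from raters whose viewpoints differ by at least
    `min_spread` -- i.e., people on both sides found it helpful."""
    helpful_viewpoints = [r.rater_viewpoint for r in ratings if r.helpful]
    if len(helpful_viewpoints) < min_helpful:
        return False
    return max(helpful_viewpoints) - min(helpful_viewpoints) >= min_spread

# A note rated helpful by raters across the spectrum is shown...
mixed = [Rating(-0.8, True), Rating(-0.6, True), Rating(0.7, True), Rating(0.2, False)]
print(note_is_shown(mixed))  # True

# ...while one backed only by one side is not.
one_sided = [Rating(-0.8, True), Rating(-0.7, True), Rating(-0.6, True)]
print(note_is_shown(one_sided))  # False
```

The design choice this mimics is that raw vote counts are easy to swamp with a coordinated bloc; requiring agreement from raters who usually disagree makes a note harder to game.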

Even with a global network of fact-checkers, Meta’s program has struggled to control the spread of misinformation across its platforms, where algorithms can elevate divisive content. The initiative has also been assailed in conservative circles as a form of censorship, including by President-elect Donald Trump. On Tuesday, Zuckerberg said the move was necessary to meet the “cultural tipping point toward once again prioritizing speech” triggered by Trump’s presidential victory. (Trump, who has railed against fact-checking, praised the policy, saying, “I think they’ve come a long way.”)

Tech giants such as Meta “never wanted responsibility for all the content posted online,” said Brendan Nyhan, a professor of political science at Dartmouth.

He described moderation as a “lose-lose” for platforms trying to balance pressure from many different parties. “Moving toward less restrictive policies and crowdsourced fact-checking reduces the amount of blame they can take.”

Meta said it will phase in its version of Community Notes over the next several months and refine it over the course of 2025. A company spokesperson declined to say exactly how Meta’s version will resemble or differ from X’s.

David Inserra, a fellow for free expression and technology at the libertarian Cato Institute, said Meta is “taking a bold step toward enhancing free expression online” by embracing the crowdsourced fact-checking approach.

“This gives users a greater voice to discuss and debate various issues,” he said in a statement. “These steps should be cheered as a private company taking practical actions to increase the expression of its 3 billion users.”

But Yael Eisenstat, Facebook’s former head of election integrity for political ads in 2018, countered that by outsourcing content moderation, the companies are abdicating their responsibilities to the public. X and Meta are not “neutral pipes through which speech flows,” she said, but “curated feeds based on algorithmic decision-making and priorities set by the owners.”

“Every decision they make is a decision with what they are going to do with other people’s speech,” said Eisenstat, who is now a senior fellow at Cybersecurity for Democracy, a multi-university nonprofit.

The move by Meta reflects a trend across social media companies to roll back Biden-era moderation policies in response to a years-long conservative campaign against fact-checking and content moderation. Musk has championed the shift since taking over X in 2022, dismantling the company’s trust and safety team, and welcoming back accounts previously banned for repeatedly spreading harmful misinformation.

He positioned the moves as part of a mission to restore “truth” and “free speech” to a social network he said had been tilted to favor left-wing views, touting Community Notes as “the best source of truth on the internet.” X did not respond to a request for comment.

Marco Piani, a 47-year-old physicist, said he used to spend substantial time crafting Community Notes. He joined the program after Musk acquired Twitter, and he spent time debunking claims around the covid-19 pandemic.

Recently, he has pulled back from the feature, saying it doesn’t feel worth his time. The bulk of his proposed notes have been stuck in limbo as other users debate whether they are worthy, he said. Meanwhile, he watches helplessly as misleading posts spread rapidly. Piani’s frustrations are backed up by data analysis by The Washington Post, which found the feature often fails to provide a meaningful check on misinformation.

“All those contributions are just lost,” he said.

Despite the program’s shortcomings on X, Wang said he still does what he can “in pursuit of the truth.”

Wang said he has crafted notes that have appeared on posts from prominent accounts, including those of Musk and Rep. Marjorie Taylor Greene (R-Georgia). He said he mostly focuses on issues involving health care, data and AI deepfakes, which “tend to be the easiest topics” to debunk. He tries to avoid divisive political topics, such as claims that the 2020 election was stolen, which he said “would require too many words and points to be argued” given the feature’s limitations.

In a recent post where a prominent person falsely said the man behind the New Year’s Day attack in New Orleans was a “migrant” terrorist, Wang offered an extensive fact check showing that the assailant was in fact an American citizen, a correction backed by nearly a dozen sources. His note was affixed to the post, which as of Wednesday had 4 million views.

“While I can’t change everyone’s opinion, I can at least demonstrate the logical fallacies so a few people can be persuaded by the facts,” he said.