Meta is ending its fact-checking program in favor of an X-like ‘community notes’ system

Meta CEO Mark Zuckerberg announced several major changes to the company’s policies and practices on Tuesday, citing a changing political and social landscape and a desire to embrace free expression.

Zuckerberg said Meta would end its fact-checking program with trusted partners and replace it with a community-led system similar to X’s Community Notes.

Zuckerberg said the company is also changing its approach to political content, rolling back de-politicization efforts that had reduced the amount of political material in users’ feeds.

The changes will affect Facebook and Instagram, two of the world’s largest social media platforms with billions of users between them, as well as Threads.

“We’re going back to our roots and focusing on reducing errors, simplifying our policies, and restoring free expression on our platforms,” Zuckerberg said in a video. “More specifically, here’s what we’re going to do. First, we’re going to get rid of fact-checkers and replace them with community notes similar to X, starting in the United States.”

Zuckerberg pointed to the election as a major influence on the company’s decision and criticized “governments and legacy media” for allegedly pushing for “more censorship.”

“The recent election also feels like a cultural shift toward, once again, prioritizing speech,” he said. “So we’re going to go back to our roots and focus on reducing errors, simplifying our policies, and restoring free speech on our platform.”

He also said that the systems the company has built to moderate its platforms make too many mistakes, adding that Meta will continue to aggressively moderate content related to drugs, terrorism, and child exploitation.

“We’ve built a lot of complex systems to moderate content, but the problem with complex systems is that they make mistakes,” Zuckerberg said. “Even if they accidentally censor just 1% of posts, that’s millions of people, and we’ve reached a point where there’s just too many mistakes and too much censorship.”

In addition to ending the fact-checking program, Zuckerberg said the company will remove some content policies around hot-button issues, including immigration and gender. It will also refocus its automated moderation systems on what he called “high-severity violations” and rely on users to report other violations.

Meta will also move its trust and safety and content moderation teams from California to Texas.

“We will also adjust our content filters to require a higher level of confidence before removing content,” he said. “The reality is that this is a trade-off. It means we will catch less bad stuff, but we will also reduce the number of posts and accounts of innocent people that we accidentally remove.”

The move comes as Meta and other social media companies have reversed course on content moderation in recent years, driven in part by political backlash over their moderation decisions and programs. Republicans have long criticized Meta’s fact-checking system, and fact-checking in general, as unfair and biased in favor of Democrats, a contentious claim.

X’s Community Notes system, which owner Elon Musk has used to replace the company’s previous misinformation efforts, has been hailed by conservatives; it allows for a crowdsourced mix of fact-checks, hoaxes, and other community contributions.

Zuckerberg’s announcement comes as CEOs and business leaders across sectors are growing increasingly comfortable with the incoming administration of President-elect Donald Trump. Meta, along with other tech companies, donated $1 million to Trump’s inaugural fund, and before the election, Zuckerberg praised Trump in an interview with Bloomberg TV without giving a full endorsement. Before Trump’s inauguration, Meta reportedly appointed Republican Joel Kaplan to lead its policy team, and on Monday the company announced that UFC’s Dana White, a longtime Trump supporter, would join its board.