Meta, the parent company of Facebook and Instagram, has announced major changes to its content moderation policies aimed at restoring free expression on its platforms.

The decision, revealed Tuesday by CEO Mark Zuckerberg, includes eliminating the company’s third-party fact-checking program and replacing it with a community-driven model similar to the Community Notes feature on X (formerly Twitter).

“We’re going to get back to our roots and focus on reducing mistakes, simplifying our policies, and restoring free expression on our platforms,” Zuckerberg said in a video posted Tuesday morning.

“More specifically, we’re going to get rid of fact-checkers and replace them with Community Notes similar to X, starting in the U.S.”

Zuckerberg acknowledged that Meta’s content moderation practices, implemented after the 2016 election, were designed to manage misinformation amid political pressure but admitted that the system had “gone too far.”

Joel Kaplan, Meta’s chief global affairs officer, elaborated on the changes during an exclusive interview on Fox News Channel’s Fox & Friends.

“This is a great opportunity for us to reset the balance in favor of free expression,” Kaplan said. “As Mark says in that video, what we’re doing is we’re getting back to our roots and free expression.”

The third-party fact-checking program, established to address concerns about misinformation, had been criticized for its perceived political bias and overreach.

Meta executives said the changes reflect a desire to move away from these controversial practices and promote open dialogue.

The announcement was met with mixed reactions.

Critics from legacy media outlets, including CNN’s Brian Stelter, expressed concern that the changes could lead to the spread of misinformation.

Fact-checking organizations voiced disappointment, suggesting the move could undermine efforts to combat false information on social media platforms.

Proponents of the changes, however, argue that the shift marks a step toward protecting free speech. “This is about creating an environment where people can share ideas, even if they’re controversial or wrong,” Kaplan said.

The decision to scrap fact-checkers and adopt a community-based model reflects Meta’s broader push to simplify its content moderation policies while reducing censorship errors.

By implementing Community Notes, Meta aims to provide a transparent and collaborative approach to evaluating content without relying on third-party organizations.

The move also aligns with growing calls for social media platforms to prioritize free speech and reduce the influence of political biases in content moderation decisions.

As Meta rolls out its new policies, the platform’s focus on free expression is likely to spark further debate about the role of social media in managing online discourse.

Zuckerberg’s announcement signals a significant departure from the company’s previous approach, with the stated aim of fostering a more open environment for users to express their views.

The rollout of Community Notes is expected to begin in the U.S., with Meta gradually expanding the system to other regions.

While the changes may face criticism from certain quarters, they mark a notable shift in the way one of the world’s largest social media platforms approaches content moderation.