Meta ends third-party fact-checking program to prepare for Trump’s return

Facebook owner Meta is ending its third-party fact-checking program and will instead rely on users to flag misinformation, as the social media giant prepares for Donald Trump's return to the presidency.
The $1.6 trillion company said on Tuesday it would “allow more speech by lifting restrictions on certain topics in mainstream discourse and focusing enforcement on illegal and serious violations” and take “a more personalized approach to political content”.
“It’s time to return to the roots of free speech on Facebook and Instagram,” Meta CEO and co-founder Mark Zuckerberg said in a video post.
Trump sharply criticized Zuckerberg during last year’s U.S. presidential campaign, suggesting he would “spend the rest of his life in jail” if Meta interfered in the 2024 vote.
But the Facebook founder has tried to rebuild his relationship with the president-elect after his victory in November, including visiting him at his Mar-a-Lago residence in Florida.
On Monday, Meta appointed UFC founder and prominent Trump supporter Dana White to its board of directors, further deepening its ties with the incoming U.S. presidential administration.
White will serve on Meta’s board alongside tech investor Marc Andreessen, another Trump ally who has long pushed the company to loosen regulations on online content.
Zuckerberg said the complexity of Meta’s content moderation system, which expanded after Trump was first elected in December 2016, had introduced “too many errors and too much censorship.”
Starting in the U.S., Meta will move to a “Community Notes” model, similar to the one employed by Elon Musk’s X, which allows users to add context to controversial or misleading posts. Meta itself will not write the notes.
Meta said it has “no immediate plans” to end third-party fact-checking and introduce Community Notes outside the United States. It is unclear how such a system would comply with regimes such as the EU’s Digital Services Act and the UK’s Online Safety Act, which require online platforms to take steps to deal with illegal content and protect users.
Meta will also change its systems to “significantly reduce” the amount of content removed from the platform by automated filters, Zuckerberg added.
This includes lifting restrictions on topics such as immigration and gender, and focusing its systems on “illegal and highly serious offenses” such as terrorism, child exploitation and fraud, as well as content related to suicide, self-harm and eating disorders.
He acknowledged that the changes mean Meta “will catch less bad content,” but argued that the trade-off is worth it to reduce the number of “innocent” posts that are removed.
The changes bring Zuckerberg closer to Musk, who in 2022 drastically cut back on content moderation after acquiring the social media platform then known as Twitter.
“As on X, Community Notes will require agreement between people with a range of viewpoints to help prevent biased ratings,” Meta said in a blog post.
“This is so cool,” Musk said of Meta’s changes in an X post.
Prominent Republican Joel Kaplan, who was announced last week as Sir Nick Clegg’s replacement as president of global affairs, told Fox News on Tuesday that Meta’s third-party fact-checkers were “too biased.”
Referring to Trump’s return to the White House on January 20, Kaplan added: “We have a real opportunity now. We have a new administration and a new president who are great defenders of free speech, and that makes a difference.”
As part of the changes announced Tuesday, Meta also said it would move its U.S.-based content reviewers from California to Texas. “I think it will help us build trust by doing this work in a place where we’re less worried about team bias,” Zuckerberg said.
Meta’s changes were slammed by online safety campaigners. Ian Russell, whose 14-year-old daughter Molly took her own life after viewing harmful content on sites such as Instagram, said he was “dismayed” by the plans.
“These measures could have dire consequences for many children and young people,” he said.
Zuckerberg first introduced third-party fact-checking in late 2016 as part of a series of measures aimed at countering criticism of widespread misinformation on Facebook.
He said at the time that the company needed “more robust detection” of misinformation and would work with the journalism industry to learn from journalists’ fact-checking systems.
Meta said it now spends billions of dollars a year on safety and security systems and employs or contracts tens of thousands of people around the world.
But on Tuesday, Zuckerberg accused governments and “legacy media” of pushing his company toward more and more censorship.
He said Meta would work with the Trump administration to “counter” efforts by governments around the world to go after U.S. companies and push for more censorship.
He pointed to restrictive regimes in China and Latin America and highlighted what he called “a growing number” of European laws that “institutionalize censorship” and make it difficult to build anything innovative there.
Meta shares fell 2% to $616.11 on Tuesday morning.