Meta, the tech company behind Facebook, Instagram and Threads, announced a radical change in its content moderation policy on Tuesday.
The company will discontinue the third-party fact-checking program it introduced in 2016 to combat misinformation. Instead, it will rely on user-written notes to flag potentially false or misleading posts.
The new approach is modeled on Community Notes, the system Elon Musk uses on X, and reflects Meta’s pivot toward greater freedom of expression.
The social media giant announced that it would “allow for more speech by lifting some restrictions on topics that are part and parcel of mainstream discourse, and focusing on illegal and severe violations” as well as “take a more personal approach to political content”.
“It’s time we get back to the roots of free expression,” Mark Zuckerberg, CEO of Meta, said in a video.
Zuckerberg acknowledged the current system was too complex, resulting in “too many errors and censorship.”
What are the changes?
Zuckerberg said that Meta will also change its systems to “dramatically decrease” the amount of content its automated filters remove.
This includes lifting restrictions on topics like immigration and gender in order to focus enforcement on “illegal and severe violations” such as child exploitation, fraud, and content related to suicide, self-injury, and eating disorders.
He acknowledged that the change could lead to an increase in harmful content, but said it would help those whose posts or accounts were accidentally removed.
“This is a tradeoff in reality,” he said. “We’ll catch less bad stuff, but we’ll also reduce how many innocent people’s accounts and posts we accidentally remove.”
The new system will be rolled out in the United States over the next few months; if successful, Meta plans to expand it further.
Meta’s shares fell by 1.47% as of 1:54 pm.
Joel Kaplan and Trump ally Dana White at Meta
Meta’s decision appears to align with the incoming Trump administration and signals a recalibration of the company’s policies to fit the new political landscape.
The announcement came just one day after UFC CEO Dana White, a Trump ally, was appointed to Meta’s board of directors.
White joins Marc Andreessen – a tech investor who is a staunch advocate of reduced content moderation – in helping to shape Meta’s governance strategies.
The timing of the announcements, along with the policy reversal, shows how ardently Meta is trying to reposition itself under the Trump presidency as a proponent of free speech.
Joel Kaplan, Meta’s newly appointed global chief policy officer, who played a major role in the announcement on Tuesday, described this shift as a needed reset.
Kaplan said in an interview with Fox and Friends that “this is a wonderful opportunity for us to reset balance in favor of freedom of expression.”
Kaplan said the company’s previous fact-checking system had become “biased” and that Meta wanted to return to its roots of more unfettered speech.
He cited Elon Musk’s X as a model for a system that has few rules, and allows users to moderate one another.
“I think Elon played a very important role in moving this debate forward and refocusing people on the importance of free expression,” he said.
Texas to host content moderation staff
As part of its overhaul, Meta plans to move its US-based content moderators from California to Texas.
Zuckerberg said that the relocation is intended to rebuild trust and address concerns about bias in the moderation process.
Texas, a state known for its conservative values, is seen as a strategic choice.
Critics argue that this move could further polarize public sentiment by aligning Meta’s operations with conservative ideologies.
Safety advocates raise alarms about increased risks
Safety advocates have warned of dire consequences, particularly for vulnerable users.
Ian Russell, a prominent advocate for online safety whose daughter Molly died by suicide after viewing harmful content on Instagram, expressed his dismay.
“This could have serious consequences for children and young adults,” he said.
Digital safety experts are also concerned about Meta’s decision to end partnerships with news organizations and fact-checkers.
Many people fear that misinformation could spread more quickly without the rigorous checks that were previously in place.
Governments and media organizations around the world have also criticised the move.
Zuckerberg targeted restrictive regimes in China and Latin America, as well as European laws that he claimed “institutionalize censorship.”
He described the decision as a part of a larger effort to push back on global regulations that impede innovation and freedom of expression.
Meta’s policy is similar to Elon Musk’s “X”
The new design reflects Elon Musk’s strategy for X, formerly Twitter, where Community Notes, a user-generated moderation tool, has become a cornerstone of the platform’s fight against misinformation.
Musk, a major Trump donor, has increasingly positioned X as a platform of unfettered free speech, often provoking controversy.
Zuckerberg’s decision is part of a wider trend among tech giants moving away from centralized moderation in favor of community-driven systems.
Critics warn that while this may reduce operational complexity, it places an excessive amount of responsibility on the users to police misinformation.
Since Trump’s election in November, many major corporations have shown a clear alignment with the president-elect.
During the presidential transition, Meta made a number of announcements reflecting what CEO Mark Zuckerberg called a “cultural turning point” brought on by the election.
Trump was asked to comment on Meta’s announcement during a separate news conference that he hosted at Mar-a-Lago.
Trump said that he found Joel Kaplan’s Fox interview “impressive” and added that the company has “come a very long way.”
Moments later, Trump acknowledged that the policy change was “probably” a response to threats he had made against the company and its leader, Mark Zuckerberg.