Meta drops fact-checkers

Instagram, which is owned by Mark Zuckerberg’s Meta, will be subject to the new policies announced by the CEO in early January. // Photo by Stockcatalog, Flickr

In a video posted to Facebook on Jan. 7, Meta CEO Mark Zuckerberg announced that his company will abandon its third-party fact-checking program and adopt a community-driven system inspired by Elon Musk’s platform X. The changes will first be implemented on Instagram, Facebook and Threads in the U.S. before expanding globally.

In the video, Zuckerberg called the results of the 2024 presidential election a “cultural tipping point towards, once again, prioritizing speech.” He claimed that the existing fact-checking system had made too many mistakes and had removed content that should not have been censored, adding that politically biased fact-checkers have “destroyed more trust than they have created.”

Instead, Meta will phase in a crowdsourced approach to fact-checking content on its platforms. Although few details have been released, the approach will be modeled on the existing Community Notes system on X.

Under X’s model, contributors can anonymously add a “Community Note” with facts and context below a specific post. The note appears publicly only once enough contributors rate it as helpful. Any user with an account at least six months old, a verified phone number and no violations of X’s rules can become a contributor. New contributors are initially allowed only to rate existing community notes as helpful; eventually, they earn the ability to write and attach their own.

To discourage mass rating and coordinated manipulation, X uses a bridging algorithm: a note is shown only if contributors who have tended to disagree in their past ratings both rate it as helpful. Additionally, X proactively reaches out to contributors who are likely to offer a different perspective for their input on a particular note.
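The bridging idea can be illustrated with a toy sketch. This is not X’s actual algorithm — Community Notes ranks notes with a matrix-factorization model over rating data — and the function name, cluster labels and thresholds below are all hypothetical. The sketch only shows the core requirement: helpful ratings must come from groups that usually disagree before a note is displayed.

```python
# Hypothetical, simplified illustration of "bridging" aggregation.
# A note is shown only when contributors from more than one viewpoint
# cluster (groups that historically disagree) rate it as helpful,
# so no single side can promote a note on its own.

def note_is_shown(ratings, min_per_cluster=2):
    """ratings: list of (cluster_id, helpful) pairs, helpful a bool."""
    helpful_by_cluster = {}
    for cluster, helpful in ratings:
        if helpful:
            helpful_by_cluster[cluster] = helpful_by_cluster.get(cluster, 0) + 1
    # Require enough helpful ratings from at least two distinct clusters.
    qualifying = [c for c, n in helpful_by_cluster.items() if n >= min_per_cluster]
    return len(qualifying) >= 2
```

For example, helpful ratings from only one cluster leave the note hidden, while helpful ratings from two disagreeing clusters surface it.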

Further, Meta will simplify its content policies and lift restrictions on topics such as immigration and gender. Meta will also take a new approach to enforcement: automated filters will screen for illegal and high-severity violations, while posts with low-severity violations will be acted on only when someone reports them. Lastly, Meta will resume recommending political and civic content to its users.

Moreover, Meta’s Trust and Safety and Content Moderation team will be moved to Texas, and Zuckerberg said he will work with President Donald Trump to push back against governments that pressure American companies to censor content.

In response to the criticism of “fake news” on Facebook during the 2016 U.S. elections, Meta launched its fact-checking program that same year. It relied on third-party fact-checking partners, certified by the International Fact-Checking Network (IFCN), to combat misinformation in advertisements and content. 

However, according to Joel Kaplan, Chief Global Affairs Officer at Meta, “too much harmless content gets censored, too many people find themselves wrongly locked up in ‘Facebook jail,’ and we are often too slow to respond when they do.”

“Experts, like everyone else, have their own biases and perspectives. This showed up in the choices some made about what to fact-check and how. Over time, we ended up with too much content being fact-checked that people would understand to be legitimate political speech and debate,” he added.

Amy Bruckman, a professor at the School of Interactive Computing, agrees with Kaplan. 

“Why would we want to delegate our power to decide what we should see and not see to a corporation whose decisions are guided by neoliberal imperatives to maximize shareholder value? We want an entity that will try to do the right thing for individuals and communities, respecting individuals’ different worldviews — not a group trying to maximize profits,” she said in an email to the Technique. 

This development has not been received well by fact-checking organizations. IFCN director Angie Holan argued that fact-checking journalism has only added information and context to controversial claims and debunked hoax content and conspiracy theories. 

“The fact-checkers used by Meta follow a Code of Principles requiring non-partisanship and transparency,” she added.

Others believe that Zuckerberg is aiming to appease Trump.

“I think that Mark Zuckerberg is trying to follow in Elon’s footsteps, which means that actually, they’re going to use this guise of free speech to actually suppress critics of Trump and critics of themselves,” Rep. Alexandria Ocasio-Cortez (D-N.Y.) told Business Insider.

Samuel Woolley, the founder and former director of propaganda research at the University of Texas at Austin’s Center for Media Engagement, also called the Trust and Safety and Content Moderation team’s move to Texas a political decision. 

“The perception of California in the United States and among those in the incoming [presidential] administration is very different than the perception of Texas,” he added.

This move came just before the inauguration of Trump, who spoke vehemently against social media censorship during his campaign.

“If we don’t have free speech, then we just don’t have a free country… If this most fundamental right is allowed to perish, then the rest of our rights and liberties will topple just like dominos, one by one,” he said in 2022 while announcing his Free Speech Policy Initiative.

Under this proposed policy, Trump pledged to ban federal agencies from collaborating with individuals and organizations to censor American citizens and from using federal funds for such efforts. He also pledged to push for laws requiring digital platforms to maintain “high standards of neutrality, transparency, fairness, and non-discrimination.” Lastly, he promised to direct federal agencies to investigate the “new online censorship regime” and prosecute the parties involved.

Trump announced this initiative in December 2022, during his campaign. Now that he is back in office, time will tell how his promises materialize.

Munmun De Choudhury, an associate professor in the College of Computing, believes a much bigger and more nuanced discourse on users’ responsibility must also occur. She argues that while companies should be held accountable for the content on their platforms, users should also apply critical thinking when consuming that content.

“Yes, there is a need for some form of fact-checking and community involvement to help us discern what is right or wrong,” she said. “But also we as informed citizens need to exercise critical thinking and recognize what pieces of information should be trusted or not.”
