Meta replaces third-party fact-checking with ‘Community Notes’ ahead of Trump’s second act

Meta has announced changes to its content management and moderation policies, aiming to balance free expression and responsible enforcement on its platforms. 

The announcement comes as Donald J. Trump is set to assume the presidency for the second time. Notably, Meta donated around $1M last year to support Trump’s inauguration.

Starting in the United States, Meta will end its independent third-party fact-checking program, which was launched in 2016 to address misinformation, particularly viral hoaxes.

“We made what we thought was the best and most reasonable choice at the time, which was to hand that responsibility over to independent fact-checking organisations. The program intended to have these independent experts give people more information about the things they see online. That’s not the way things played out, especially in the United States,” says Joel Kaplan, Chief Global Affairs Officer.

“Experts, like everyone else, have their own biases and perspectives. This showed up in the choices some made about what to fact-check and how. Over time we ended up with too much content being fact-checked that people would understand to be legitimate political speech and debate,” adds Kaplan. 

Introducing Community Notes

Instead, the company is adopting a Community Notes system, similar to the one already in use on X (formerly Twitter).  

The new Community Notes system allows users to add context to posts, so that a range of viewpoints is represented.

Meta will not write or manage these notes; instead, it will rely on contributors with differing political perspectives to keep the notes balanced.

Community Notes will initially launch in the U.S. over the next few months, with plans to refine and expand the system through user feedback.    

As part of these changes, Meta will remove its fact-checking controls and stop lowering the visibility of fact-checked content.

Instead of using full-screen warnings that require users to click through, it will apply a simpler label informing people that more information is available for those who want it.

Meta acknowledges that the complex content moderation systems it built to address various challenges have ended up limiting free expression.

The company removes millions of posts daily, less than 1 per cent of all content, but admits that 10–20 per cent of these removals may be mistakes, leading to restricted speech and user frustration.

To address this, Meta plans to restore free expression on topics that are widely debated in mainstream discourse, such as immigration and gender identity, while continuing to enforce its policies strictly for illegal and high-severity violations like terrorism, child exploitation, fraud, and scams.

“We’re also going to change how we enforce our policies to reduce the kind of mistakes that account for the vast majority of the censorship on our platforms,” says Kaplan. 

For less severe policy violations, Meta will act only after users report the content. The company currently demotes content predicted to violate its standards, but plans to reduce these demotions and raise the confidence threshold required before flagging a violation.

Tailored political content

Since 2021, Meta has reduced the presence of civic-focused content—posts about politics, elections, and social matters—across its feeds, based on user feedback requesting fewer such posts.

“We are going to start phasing this back into Facebook, Instagram, and Threads with a more personalised approach so that people who want to see more political content in their feeds can,” adds Kaplan. 

By analysing explicit social signals (e.g., posts users like) and implicit behaviours (e.g., how long users view posts), Meta will let those interested in civic or political content see more of it. Users will also gain expanded control over how much political content appears in their feeds.

“We are also going to recommend more political content based on these personalized signals and are expanding the options people have to control how much of this content they see,” he adds.
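For illustration only, the sketch below shows one way explicit and implicit signals of the kind described above could be blended into a single preference score. Meta has not published details of its ranking signals; the names, weights, and threshold here are hypothetical.

```python
# Illustrative sketch: a hypothetical blend of explicit (likes) and implicit
# (view time) signals into a per-user "civic content" preference score.
# Meta's actual signals and weights are not public; the numbers are invented.

from dataclasses import dataclass

@dataclass
class UserSignals:
    civic_likes: int           # explicit: likes on civic/political posts
    total_likes: int           # explicit: likes on all posts
    civic_view_seconds: float  # implicit: time spent viewing civic posts
    total_view_seconds: float  # implicit: time spent viewing all posts

def civic_preference_score(s: UserSignals,
                           explicit_weight: float = 0.6,
                           implicit_weight: float = 0.4) -> float:
    """Return a 0..1 score estimating interest in civic content."""
    like_share = s.civic_likes / s.total_likes if s.total_likes else 0.0
    view_share = (s.civic_view_seconds / s.total_view_seconds
                  if s.total_view_seconds else 0.0)
    return explicit_weight * like_share + implicit_weight * view_share

# Example: a user who likes civic posts fairly often and dwells on them
user = UserSignals(civic_likes=30, total_likes=100,
                   civic_view_seconds=900.0, total_view_seconds=3000.0)
score = civic_preference_score(user)

# A feed could surface more civic content when the score passes a threshold,
# subject to the user's own visibility settings described in the article.
show_more_civic = score > 0.25
print(f"score={score:.2f}, show_more_civic={show_more_civic}")
```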

Vigneshwar Ravichandran

Vigneshwar has been a News Reporter at Silicon Canals since 2018. A seasoned technology journalist with almost a decade of experience, he covers the European startup ecosystem, from AI and Web3 to clean energy and health tech. Previously, he was a content producer and consumer product reviewer for leading Indian digital media, including NDTV, GizBot, and FoneArena. He graduated with a Bachelor's degree in Electronics and Instrumentation in Chennai and a Diploma in Broadcasting Journalism in New Delhi.
