On August 5, 2025, a report highlighted the ongoing shift in Meta’s approach to content moderation on Facebook, following the company’s decision to replace its third-party fact-checking program with a Community Notes system. The transition, announced earlier in 2025, began with the termination of contracts with U.S.-based fact-checkers around March 1, after Meta CEO Mark Zuckerberg cited the need to reduce enforcement mistakes and prioritize free expression.

The Community Notes program, inspired by a similar model on X, lets users draft and rate notes that add context to potentially misleading posts; more than 200,000 contributors have signed up across Facebook, Instagram, and Threads since testing began on March 18. A tech columnist’s experiment found that only three of 65 notes drafted over four months were published, a success rate of under 5 percent, on topics ranging from political hoaxes to natural-disaster misinformation. To be published, a note must run under 500 characters, include a supporting link, and win agreement from raters with diverse perspectives; published notes carry no penalties and do not affect a post’s distribution.

Meta reported a 50% reduction in enforcement mistakes from Q4 2024 to Q1 2025, though the prevalence of violating content remained stable. The program is currently in its U.S. testing phase, supports six languages, and is planned to expand to other countries, with the aim of scaling moderation while reducing bias.
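For illustration only, the sketch below models the publication rules described in the report as a simple eligibility check. The class names, the explicit "perspective" labels, and the two-perspective threshold are assumptions made for the example; Meta’s actual system infers viewpoint diversity from contributors’ rating histories rather than from declared labels.

```python
from dataclasses import dataclass

MAX_NOTE_CHARS = 500  # character limit described in the report


@dataclass
class Rating:
    # "perspective" is a hypothetical tag used only for this sketch.
    perspective: str
    helpful: bool


@dataclass
class NoteDraft:
    text: str
    supporting_link: str
    ratings: list[Rating]


def is_publishable(note: NoteDraft) -> bool:
    """Apply the three publication conditions described in the article:
    under 500 characters, a supporting link, and agreement from raters
    with diverse perspectives (approximated here as 'helpful' ratings
    from at least two distinct perspective labels)."""
    if len(note.text) >= MAX_NOTE_CHARS:
        return False
    if not note.supporting_link.startswith(("http://", "https://")):
        return False
    helpful_perspectives = {r.perspective for r in note.ratings if r.helpful}
    return len(helpful_perspectives) >= 2


# Example: a short, sourced note rated helpful by raters on both sides.
draft = NoteDraft(
    text="This photo predates the storm; it was taken in 2018.",
    supporting_link="https://example.com/original-photo",
    ratings=[Rating("left-leaning", True), Rating("right-leaning", True)],
)
print(is_publishable(draft))  # True
```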