Meta is shutting down its dedicated fact-checking program in the United States. According to a post on the social platform X by Joel Kaplan, Meta's global head of policy, the program will cease operations by Monday afternoon: no new fact checks will be initiated, and the formal role of dedicated fact-checkers will be phased out across the company's major platforms, including Facebook, Instagram, and its newer text-based app, Threads (https://x.com/joel_kaplan/status/1908204701457236472?s=46).

The decision marks a pivotal shift in how the social media giant handles content veracity in the US market. Instead of relying on partnerships with third-party fact-checking organizations, Meta plans to implement a system it calls Community Notes. While specific details of the US rollout are still emerging, this model typically relies on contributions from the user community to add context to, or flag, potentially misleading posts, similar to the system on X (formerly Twitter).

The move away from a centralized, expert-driven fact-checking system toward a distributed, community-based model represents a fundamental change in Meta's content-moderation philosophy. The original fact-checking program, established years ago, partnered with independent organizations to review and rate the accuracy of content, often reducing the distribution of posts identified as false or misleading. The effectiveness and perceived neutrality of such programs have long been subjects of public and political debate.

Several factors could be driving the transition to Community Notes. It may be seen as a more scalable response to the sheer volume of content generated across Meta's platforms.
It could also be an attempt to address criticisms of bias or censorship sometimes leveled at the previous fact-checking partnerships. By empowering the user community to add context, Meta may hope to foster a different kind of dialogue around disputed information, though the efficacy and potential pitfalls of crowdsourced moderation at such a massive scale remain open questions. As Meta rolls out the new strategy, the central test will be whether Community Notes can provide accurate and timely context without being overwhelmed or manipulated. The end of the formal fact-checking program opens a new chapter in the ongoing effort to manage information quality on some of the world's largest social networks, one that places greater emphasis on user-generated context and clarification going forward.