Facebook is About to Get a Little Bit Less Censorious

by
Steve MacDonald

I wandered over to a different Grok, X’s AI, and found this headline: “META Ditches Fact-Checkers for Community Notes.”

Who doesn’t click on that? Here’s what it said.

Meta, the parent company of platforms like Facebook, Instagram, and Threads, has announced a significant shift in its approach to content moderation. The company is moving away from its third-party fact-checking program, which was established in 2016, to a new system called “Community Notes.” This model mirrors the one used by X (formerly Twitter), where users contribute notes to provide context to potentially misleading posts. Here’s the latest on this development:

  • End of Fact-Checking Program: Meta has decided to end its program with independent third-party fact-checkers due to concerns over political bias and the volume of content being fact-checked. This change starts in the U.S. and aims to reduce perceived bias in content moderation.
  • Adoption of Community Notes: The new system will rely on crowdsourced contributions from users to flag and add context to posts. These notes require consensus from users with diverse perspectives before becoming visible, aiming to prevent biased ratings. This approach is expected to empower the community to manage misinformation directly.
  • Policy Shifts: Alongside this, Meta plans to lift restrictions on some topics, focusing enforcement on “illegal and high-severity violations” like terrorism and child exploitation. The policy change is part of a broader move to “restore free expression” on its platforms, reflecting a cultural shift towards prioritizing speech, especially after recent U.S. elections.
  • Implementation and Reaction: The rollout of Community Notes will begin in the U.S. over the next few months. Reaction to the change is mixed: critics are concerned that removing professional fact-checkers might increase misinformation, while supporters see it as a step towards more transparency and user control. Posts on X reflect this debate, with some users questioning the effectiveness of community notes in combating misinformation compared to professional fact-checking.
  • Broader Context: Mark Zuckerberg has cited the complexity of previous moderation systems leading to “too many mistakes” and “too much censorship.” He also noted moving the trust and safety team from California to Texas to potentially reduce bias in team operations.

This shift by Meta represents a significant pivot in how social media platforms approach content moderation, emphasizing community involvement over centralized fact-checking. However, the effectiveness and impact of this new system on misinformation and platform discourse remain to be seen as it is implemented and evolves.

Multiple media reports confirm that this is accurate.

As many of our readers have been banned, thrown in Facebook Jail, and throttled by Meta, we look forward to evaluating how this new approach affects the current suppression of content, sharing, and debate.

We used to get an obscene amount of traffic from Facebook. I’d be happy to see even some of that return. But another censor is still lurking. Facebook has pulled down multiple posts, claiming they were spam. Its rationale was that if you followed the link, you had to click on something else to get to the content, behavior it flags as suspicious or just not right.

We’ve never done that, but we get takedowns for it anyway, so I’m betting that will still be an issue. Whether that mechanism is used in place of the partisan fact-checking process to silence some speech – and to what degree – remains to be seen. And we will be watching.

Author

  • Steve MacDonald

    Steve is a long-time New Hampshire resident, blogger, and a member of the Board of Directors of The 603 Alliance. He is the owner of Grok Media LLC and the Managing Editor of GraniteGrok.com, a former board member of the Republican Liberty Caucus of New Hampshire, and a past contributor to the Franklin Center for Public Policy.