Meta has ended its third-party fact-checking program, moving to a Community Notes model, which was announced at the end of last week.
According to Meta, this will allow more speech by lifting restrictions on some topics that are part of mainstream discourse and focusing enforcement on illegal and high-severity violations.
This is a welcome change for all of us, as the previous model was repressive and worked like a black box, deciding for billions of users what they could or couldn’t see.
Now we get to decide for ourselves, and contribute to the process. It’s a more open and democratic system.
When Meta launched its independent fact-checking program in 2016, it emphasized that it did not want to act as an arbiter of truth.
Instead, Meta chose to shift this responsibility to independent fact-checking organizations.
The goal was to provide people with more information about online content, particularly viral hoaxes, enabling them to make informed judgments about what they saw and read.
In the U.S., biases in fact-checking led to questionable decisions about what to review and how.
Legitimate political speech was often flagged, resulting in intrusive labels and reduced distribution. A program meant to inform sometimes became a tool for censorship.
“We are now changing this approach. We will end the current third party fact checking program in the United States and instead begin moving to a Community Notes program. We’ve seen this approach work on X – where they empower their community to decide when posts are potentially misleading and need more context, and people across a diverse range of perspectives decide what sort of context is helpful for other users to see,” said Meta in its announcement.