Why social media can’t keep moderating content in the shadows
We are living through an epidemic of mistrust, particularly here in the United States. Trust in social media and traditional media is at an all-time low. Trust in the U.S. federal government to handle problems is at a near-record low. Trust in the U.S.’s major institutions is within 2 percentage points of the all-time low. The consequences are...
Chris Best • Society has a trust problem. More censorship will only make it worse.
The defining difference between web 1.0 and the platforms that dominate today is not technological sophistication but moral architecture. Early online communities were transparent about process and purpose. They exposed how information was created, corrected and shared. That visibility generated accountability. People could see how the system...
platforms should provide:
- Social provenance transparency: Label which communities are engaging with content. For instance, a viral post could display whether it resonates across diverse groups (shared ground) or primarily within a specific niche (different perspectives).
- Community-driven curation: Empower communities to promote content they...
Building A Prosocial Media Ecosystem | NOEMA
But I don’t think platform companies should—or can—act as the governance layer of our online lives. In this, I share tech journalist and Bluesky board member Mike Masnick’s informed belief that global centralized moderation is a dead end. And if centralized moderation isn’t a plausible way forward, then experimentation toward genuinely...