added by sari · updated 2y ago
The Invisible Labor of Content Moderation
- As communities grow, so too does their need for moderation to ensure that content does not become toxic. In communities where moderation is vital, like group therapy, AI-based solutions can augment human moderators and allow services and platforms to scale; that said, AI-based moderation is rarely cost-effective for a platform to build in-house.
from Product-Led Communities Need Picks and Shovels by Stephen Wemple
sari added
- I think there’s a diff between “content moderation” and setting expectations for civil interactions. Like a platform finding ways to encourage ppl to be civil to each other isn’t censorship, to me (done well, they should barely notice you’re even doing it). And I think that DOES, or at least can, fall in the purview of what a platform should be tas…
from Notes by Nadia Asparouhova
sari added
- moderation is, inherently, a subjective practice. Despite some people’s desire to have content moderation be more scientific and objective, that’s impossible. By definition, content moderation is always going to rely on judgment calls, and many of the judgment calls will end up in gray areas where lots of people’s opinions may differ greatly.
from Masnick's Impossibility Theorem: Content Moderation At Scale Is Impossible To Do Well by Mike Masnick
sari added