added by sari · updated 2y ago
The Invisible Labor of Content Moderation
- Moderation is hard. It’s time- and resource-intensive, and it’s unclear what the standards should be
from The Invisible Labor of Content Moderation by Gaby Goldberg
- Effectively moderating a social computing system isn’t just the right thing to do from an ethical standpoint — it’s in a platform’s best interest. Moderation shapes the platform: as a tool, as an institution for discussion, and as a cultural phenomenon.
- There seem to be three “imperfect solutions:” paid moderation, community moderation, and algorithmic moderation.
- Content moderation is how platforms shape user participation into a deliverable experience. Social computing systems moderate (through removal and suspension), recommend (through news feeds and personalized suggestions), and curate (through featured, front-page content). These dynamic decisions fine-tune the way we, as users, experience different o…
- By law, they don’t have to allow all speech: the safe harbor provision (found in Section 230 of U.S. telecommunications law) grants platforms with user-generated content the right, but not the responsibility, to restrict certain kinds of speech.