When Content Moderation Hurts

foundation.mozilla.org

Numbers alone do little to illustrate the human impact of a bad content-moderation decision on the world’s biggest internet platforms. Whether it’s content that should be taken down, or content that was unjustly removed, seeking a clear explanation or a reversal can be endlessly frustrating.

Content-focused regulation often privileges automation and filtering as a universal remedy for content moderation on large platforms. Far too often, this leads to situations where ‘bad’ content remains online while ‘good’ content is taken down. Today, artificial intelligence is a prominent component of content moderation at scale, but it is a blunt instrument for moderating public-interest expression that must be understood within a particular human context.
