Facebook may not be the biggest site on the planet, but it’s certainly the most connected, and it’s the biggest home for user-generated content on the web. The problem with user-generated content is that it needs to be moderated in one form or another, and with 350 million users it’s easy to imagine the amount of work required to weed out objectionable material. To help with this gargantuan task, Facebook is turning to its most valuable asset, its users: the social network is now testing a crowd-sourced model of content moderation.
The social network has created the Facebook Community Council, currently in private testing, a feature that lets users review content others have reported as potentially objectionable and place it into one of eight available categories, such as Spam, Drugs, Attacking (“direct attacks against non-public figures”), or Acceptable.
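To give a rough sense of how such crowd-sourced review could be aggregated, here is a minimal, purely illustrative sketch in Python. The category names follow the article, but the vote-tallying logic, the threshold, and the function names are assumptions for illustration, not Facebook’s actual implementation.

```python
from collections import Counter

# Categories mentioned in the article; the full set of eight is not
# listed, so this subset is only illustrative.
CATEGORIES = {"Spam", "Drugs", "Attacking", "Acceptable"}

def classify_reported_item(votes, min_votes=5):
    """Aggregate council members' ratings of one reported item.

    `votes` is a list of category names chosen by individual reviewers.
    Returns the majority category, or None if there aren't enough votes
    yet. The threshold and tie handling are assumptions for this sketch,
    not Facebook's actual policy.
    """
    valid = [v for v in votes if v in CATEGORIES]
    if len(valid) < min_votes:
        return None  # not enough reviews yet to decide
    most_common, _count = Counter(valid).most_common(1)[0]
    return most_common

# Example: five council members rate the same reported post.
print(classify_reported_item(["Spam", "Spam", "Acceptable", "Spam", "Spam"]))
# -> "Spam"
```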
“The Facebook Community Council is a way for users to tell us whether reported content violates our policies. We’ve found that people aren’t shy about reporting content they come across that looks suspicious, and this is just another way of leveraging the Facebook community to help maintain the site’s trusted environment. It’s still in an experimental stage…”