1.5 Content Moderation

A quality-control mechanism for maintaining the health and veracity of content on the platform.

Philosophy

While content moderation may not be a flashy selling point or a clever marketing tool, it is paramount to the safety, utility, usability, and enjoyability of the platform, as well as to the disposition of its members. And the choice of moderation process has a uniquely direct influence: it can make the platform hospitable to philanthropic attitudes and endeavors, and inhospitable to misanthropic ones.

While we strongly believe that the structure and purpose of the platform make it an unattractive target for spammers, trolls, and provocateurs, it is a virtual guarantee that some questionable content will make it onto the site. Rather than trying to shield our members from unpleasantness, we grant them the ability to confront it directly.

We anticipate that, whether through ignorance or malice, individual lapses in judgement will occur. But we also anticipate that, on the whole, the disposition and sensibility of our aggregate member base will remain sound. Therefore, instead of relying on farms of hired moderators, we crowdsource the functions of content moderation to the membership itself.
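To make "crowdsourced moderation" concrete, here is a minimal sketch assuming a simple flag-and-vote threshold model. Every name, quorum, and majority figure below is a hypothetical illustration, not the platform's specified design.

```typescript
// A minimal sketch of crowdsourced moderation, assuming a flag-and-vote
// threshold model. All names and thresholds are hypothetical.

type Verdict = "keep" | "remove";

interface Flag { memberId: string; reason: string; }
interface Vote { memberId: string; verdict: Verdict; }

class ModerationCase {
  private flags = new Map<string, Flag>();
  private votes = new Map<string, Vote>();

  constructor(
    public readonly contentId: string,
    // Hypothetical thresholds: how many flags open a case for review,
    // and what share of voters must agree before content is removed.
    private readonly flagQuorum = 3,
    private readonly removalMajority = 0.66,
  ) {}

  flag(memberId: string, reason: string): void {
    // One flag per member; repeat flags are ignored.
    if (!this.flags.has(memberId)) {
      this.flags.set(memberId, { memberId, reason });
    }
  }

  get openForVoting(): boolean {
    return this.flags.size >= this.flagQuorum;
  }

  vote(memberId: string, verdict: Verdict): void {
    if (!this.openForVoting) {
      throw new Error("Case has not reached the flag quorum yet.");
    }
    // Last vote wins, so members may change their minds.
    this.votes.set(memberId, { memberId, verdict });
  }

  verdict(): Verdict | "pending" {
    if (this.votes.size === 0) return "pending";
    const removeVotes = [...this.votes.values()]
      .filter((v) => v.verdict === "remove").length;
    return removeVotes / this.votes.size >= this.removalMajority
      ? "remove"
      : "keep";
  }
}

// Usage: three flags open the case, then member votes decide it.
const c = new ModerationCase("post-123");
c.flag("alice", "spam");
c.flag("bob", "spam");
c.flag("carol", "off-topic");
c.vote("dave", "remove");
c.vote("erin", "remove");
c.vote("frank", "keep");
console.log(c.verdict()); // "remove" (2/3 >= 0.66)
```

Under a model like this, a handful of flags merely opens a case; the verdict itself remains with the voting members, keeping removal decisions in the hands of the community rather than any individual.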

The direct outcomes of content moderation are the removal of content and/or the censure of members. But its implications are greater, and its repercussions farther-reaching, than those of nearly any other feature of a platform. Happily, the same mechanisms put in place for content moderation can be used for decision-making on the platform generally (more on that in a later section).

Examples

MOST would agree that it's NOT okay to "kick puppies".

A permaculture guild can decide FOR ITSELF whether "carving lawn gnomes" is a relevant activity.

MOST can identify inquiries about "your car's extended warranty" as dubious.
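These examples suggest a two-level scoping of such judgements: some norms hold platform-wide, while others are decided by each community for itself. The sketch below illustrates one way that lookup might work; the rule names, precedence order, and default are assumptions for illustration, not the platform's specified behavior.

```typescript
// A minimal sketch of scoped rulings, assuming a two-level lookup:
// platform-wide norms apply everywhere, and each community (e.g., a
// guild) may add rulings of its own. All names and rulings are
// hypothetical.

type Ruling = "allowed" | "disallowed" | "dubious";

// Platform-wide norms, where near-universal consensus exists.
const platformNorms = new Map<string, Ruling>([
  ["kick puppies", "disallowed"],
  ["your car's extended warranty", "dubious"],
]);

// Per-community rulings, decided by each community for itself.
const communityRulings = new Map<string, Map<string, Ruling>>([
  [
    "permaculture-guild",
    new Map<string, Ruling>([["carving lawn gnomes", "allowed"]]),
  ],
]);

// Platform norms are checked first, so near-universal prohibitions
// cannot be overridden locally; community rulings cover topics the
// platform leaves open, and unknown topics default to "allowed".
function ruleOn(community: string, topic: string): Ruling {
  return (
    platformNorms.get(topic) ??
    communityRulings.get(community)?.get(topic) ??
    "allowed"
  );
}

console.log(ruleOn("permaculture-guild", "kick puppies"));        // "disallowed"
console.log(ruleOn("permaculture-guild", "carving lawn gnomes")); // "allowed"
console.log(ruleOn("permaculture-guild", "composting"));          // "allowed"
```

Checking platform norms first means the near-universal prohibitions cannot be overridden locally, while each community remains free to rule on everything the platform leaves open.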

Form and Function
