REMOVE OR REDUCE: DEMOTION, CONTENT MODERATION, AND HUMAN RIGHTS
Working paper available on SSRN
How should social media platforms respond to harmful or otherwise undesirable speech? For some content, platforms respond by removing it. But for other content, platforms allow the speech to be posted, but reduce its distribution throughout their networks so that fewer people see it. Despite its widespread use, this second tactic—demotion—remains normatively undertheorized. This article remedies that neglect by offering a framework for when platforms should demote content instead of removing it. After explaining why platforms' demotion policies stand in need of normative justification, it argues that demotion should be pursued instead of removal in cases where removal would violate principles of necessity and proportionality, yet demotion would satisfy them. By explaining when and why this is so, the article seeks to illuminate how necessity and proportionality principles—central to both political philosophy and human rights law—should be applied to the distinctive context of online content moderation.
Written by Jeffrey W. Howard and Beatriz Kira