Content Moderation Can't Be Centralized

January 12, 2021

How can we introduce oversight into social media content moderation without unduly restricting free speech? We need to create systems of decentralized moderation that emphasize transparency and accountability.

The people who build and maintain platforms shouldn't be the same people who decide what content is legitimate.

If objectionable content is profitable, how can we trust platform owners to decide what stays up? The conflict of interest is too strong. Many, if not all, of the moderators should come from outside big tech.

Even setting aside the international nature of the problem, should this responsibility be handed to our government? Recent Congressional hearings have shown that few lawmakers understand how social media platforms work or even how they make money. The government is ill-equipped and operates too slowly to keep up with the oversight needs of content moderation.

Creating an all-powerful censorship authority isn't a desirable outcome; it would be ripe for abuse. Instead, we need a decentralized moderation approach. Give users and communities control over what's promoted and what's banned. Involve as many parties in the process as possible.

Lastly, any moderation approach should be paired with efforts to prevent the need for moderation in the first place. Build networks that incentivize civil behavior over sensational content. Hold people accountable for their actions and reward constructive voices.
