Discord Pioneers New Educational, Rehabilitative Approach to Platform Moderation

I. Discord’s Gaming Roots Shape its Perspective on Toxic Behavior

At Discord’s San Francisco headquarters, the gaming roots that nurtured the platform are apparent, from arcade decor to employees competing in shooters. As Discord grew beyond gaming into music, education, and more, it retained insights gained moderating gaming’s notoriously toxic behavior.

If someone can invent a new harm, angry gamers have likely already tested it out. Gaming communities thus act as petri dishes for understanding how online toxicity evolves. Discord knows that stopping the next dangerous trend means understanding how it first emerges and spreads in gaming.

High-profile controversies, like extremist propaganda and racist screeds circulating on the platform, show that moderation remains vital as Discord expands. But most issues are common ones like harassment and hate speech. Discord's core user base, roughly 50% of whom are aged 13 to 24 and many of whom are teenagers, also drives its interest in reforming rule-breakers rather than simply banning them.

II. Discord Critiques the Traditional ‘Three Strikes’ Model of Moderation

Discord concluded that classic moderation policies, like three strikes followed by a ban, don't work well. They lack nuance and proportionality in their penalties and make no attempt to rehabilitate. Teens posting harmful content may need help, not bans.

Most platforms lean on automated systems that issue strikes, reviews, and bans with little human oversight. While efficient, this treats minor and major offenses the same and does little to deter repeat offenders. Discord knew it needed a more refined, behavioral approach.
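To make the critique concrete, here is a minimal sketch of the classic model. The names and thresholds are illustrative, not drawn from any platform's actual code:

```python
# A minimal sketch of the "three strikes" model critiqued above.
# Names and thresholds are illustrative, not any platform's real code.
STRIKE_LIMIT = 3

strikes: dict[str, int] = {}  # user_id -> running strike count

def handle_violation(user_id: str, offense: str) -> str:
    """Every offense adds exactly one strike, regardless of severity."""
    strikes[user_id] = strikes.get(user_id, 0) + 1
    if strikes[user_id] >= STRIKE_LIMIT:
        return "permanent_ban"  # mild spam and violent threats end the same way
    return "strike_warning"     # no explanation, no education, no restriction
```

A single counter like this is exactly what Discord is arguing against: the offense type is recorded but never shapes the response.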

III. Discord’s New System Focuses on Education and Proportionate Restrictions

Discord’s new warning system aims to educate users on violations through direct messages explaining the offense and policy details. Violations also now trigger restrictions proportionate to the issue, like limiting media uploads for posting gore.
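As a rough illustration of what proportionate enforcement could look like, the sketch below maps offense categories to targeted restrictions paired with an explanatory message. The categories, durations, and helper functions are assumptions for this example, not Discord's actual policy or API:

```python
# A hypothetical sketch of proportionate, educational enforcement.
# Offense categories, restrictions, and durations are invented for
# illustration; this is not Discord's actual policy or API.
from dataclasses import dataclass

@dataclass
class Action:
    explanation: str   # sent to the user as a direct message
    restriction: str   # a targeted capability limit, not a blanket ban
    days: int

POLICY: dict[str, Action] = {
    "gore":       Action("This post broke our graphic-content policy.",
                         "suspend_media_uploads", days=7),
    "harassment": Action("These messages broke our harassment policy.",
                         "suspend_direct_messages", days=3),
    "spam":       Action("This activity broke our spam policy.",
                         "rate_limit_messages", days=1),
}

def send_dm(user_id: str, message: str) -> None:
    print(f"DM to {user_id}: {message}")  # stand-in for a real messaging call

def apply_restriction(user_id: str, restriction: str, days: int) -> None:
    print(f"{user_id}: {restriction} for {days} day(s)")  # stand-in for enforcement

def enforce(user_id: str, offense: str) -> Action:
    """Explain the violation first, then apply a matching restriction."""
    action = POLICY[offense]
    send_dm(user_id, action.explanation)
    apply_restriction(user_id, action.restriction, action.days)
    return action
```

The design point is that the offense type drives both the message and the restriction, so a user who posts gore loses media uploads for a while instead of losing their account.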

Bans are a last resort, typically one-year rather than permanent. Instagram takes a similar educational approach, but Discord pushes rehabilitation further, believing most users can improve with guidance.

For teens especially, outright bans from social spaces feel too severe. Discord’s teen safety features like image blocking and alerts about strangers also aim to guide young users positively.

IV. Discord Considers Adapting Warnings for Servers Too

Discord is now exploring how to adapt its warnings model to rein in problematic servers. Defining accountability is complex, as some “moderators” may be inactive or unauthorized. Blanket bans feel excessive when owners are unaware of issues.

Analyzing metadata, owner actions, and user behavior appears to be the best way to identify truly troublesome servers that warrant restrictions, versus those that just need education. There are no perfect solutions, but Discord keeps adapting.
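One way to picture that triage is a weighted score over the three signal families. The features, weights, and threshold below are invented for illustration; any real system would be far more involved:

```python
# An illustrative heuristic for triaging servers using the three signal
# families named above. Features, weights, and the threshold are invented
# for this sketch; they are not Discord's actual model.
def server_risk_score(
    violation_rate: float,         # metadata: share of messages confirmed as violations
    owner_response_rate: float,    # owner actions: share of flagged content acted on
    repeat_offender_share: float,  # user behavior: fraction of members with prior violations
) -> float:
    """All inputs are fractions in [0, 1]; higher output means higher risk."""
    return (
        0.5 * violation_rate
        + 0.3 * (1.0 - owner_response_rate)  # owner inaction raises risk
        + 0.2 * repeat_offender_share
    )

def triage(score: float, threshold: float = 0.6) -> str:
    # Low scores suggest education (e.g., notifying the owner);
    # high scores suggest restricting the server itself.
    return "restrict" if score > threshold else "educate"
```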

V. A Promising Step Toward Less Punitive, More Educational Moderation

Discord's moves toward reforming, not just removing, users offer a new model that other platforms could emulate. At a time when trust and safety teams are often caricatured as censors, Discord sets an example of socially minded innovation.

Approaching moderation as an opportunity to guide users positively, especially impressionable young people, could foster healthier online communities. Discord's willingness to question an ineffective status quo and experiment deserves wide praise and attention. The work is progressing gradually, but each iteration yields useful lessons.

