I don’t think Ryan Moulton necessarily wanted the bulk of his explainer on content moderation to read like an apologia for Twitter, so I’m mostly going to skip to some stuff at the end, which addresses the fact that part of the problem with modern, mainstream social media platforms is their sheer size.

Give communities the tools to moderate themselves at scale. Give users (or advertisers) the tools to control their own experience at scale. It may be impossible to teach someone with specificity what is and isn’t doxxing or what the early signs of getting targeted by the alt-right are, but once you’ve experienced them, or watched it happen to someone, you’ll never forget. […] Know that perfect content moderation is unattainable, and your users cannot wait for you to improve. They need the tools now, and they will be better protecting themselves than anything you develop as an outsider.

Which is basically what I’ve been getting at when pushing Twitter to let users automate author-moderated replies, or even to actively consider implementing a kind of Mastodon-like internal federation of discrete, self-assembled communities.

I do think that social media relies too heavily upon algorithmic moderation, but at some point a platform simply gets too large to moderate purely by human hand. At that point, the only option, maybe, is to break up the monopoly on moderation by also giving users the tools to organize themselves.