Keeping Mastodon non-toxic & benevolent requires total community buy-in when it comes to moderation.

Yes, we have mods + admins. But we must also habitually practice individual moderation using the robust safety tools at our disposal.

This means blocking clear-cut trashbags AND reporting them to admins.

It also means reporting trashbags to their own home instance's admins.

It means calling out bad actors early and often & sharing information via #FediBlocks & boosts.

The threat of defederation is HUGE leverage for an individual member. It gives agency to us & to our entire community.

If an admin receives confirmed reports on the same trashbag in their instance from multiple members - especially members or admins from other instances - that massively sways their cost/benefit and risk analysis.

Is it worth it to an admin to keep a stone-cold trashbag if it means their entire instance losing access to other servers? Usually not.

How a moderator or instance addresses trashbags is ultra-instructive. It tells me how they or their instance and their members may view moderation, aka my safety, aka my own rights to exist as a human.

If someone serves me mumbling, shrugging indifference when called out re: permitting racists, fascists, TERFs, trolls, rogue channers into their instance - even *hypothetically* - then that makes the math easy. They aren't serious about moderation, and I can block/report/warn accordingly.

But this only works if individual members meet moderation at least halfway on Mastodon.

This site isn't free, and safe, ethical, inclusive spaces aren't the default setting. It's like a community garden. It takes constant watering, feeding, work, and care by all members collectively to maintain healthy spaces.

Doing nothing lets the weeds grow and the roaches infest, and the space goes to seed. (see also: Twitter and their "you do you" lack of moderation)

I've spent a lot of time as a moderator in other social spaces (shout out to defunct trans chat rooms and Geocities and Yahoo groups lol).

I spent 10+ years on Twitter self-moderating where doing so was antithetical to its algorithm and ethos. (It's why their safety tools suck; it's bad for the algo.)

I've thought about and beat the drum endlessly on moderation. It has to be built-in FIRST. Not case-by-case or retroactively.

Letting even one trashbag in a safe space can be a tipping point.

Being proactive in one's moderation to keep this space inclusive & safe for all demonstrates empathy & care for yr fellow Mastodonians.

But you can't be empathetic and caring and you aren't being an ally to your mates if you're sticking w/passive "you do you" moderation, or if you have a "welp can't see it from my instance" approach when you hear about a trashbag incident.

It's different here than other social spaces, & we must act differently here.

@mxtiffanyleigh thank you! moderation is absolutely a core element of community building and nurturing healthy platforms.

btw, for anyone who might know: since I'm currently easing into hosting a server of my own, I'm still a bit overwhelmed with the admin-side technology involved. Are there any tools with which I can import a block list? Preferably with a UI?

Manually blocking bad actors already looks like a several-hour job of copying and pasting just the most obvious public recommendations.
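Not an authoritative answer, but one option besides the admin web UI is scripting it against the Mastodon admin API, which exposes a `POST /api/v1/admin/domain_blocks` endpoint. A minimal sketch, assuming a simple `domain,severity` CSV (many public blocklists publish something close to this) and an admin access token with the `admin:write:domain_blocks` scope; the CSV layout and the `load_blocklist`/`push_block` helper names are my own illustration, not a standard tool:

```python
import csv
import io
import json
import urllib.request


def load_blocklist(csv_text):
    """Parse "domain,severity" CSV rows into admin-API payloads.

    Severity defaults to "suspend" (full defederation) when the
    column is missing or empty; "silence" limits instead.
    """
    payloads = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        payloads.append({
            "domain": row["domain"].strip(),
            "severity": (row.get("severity") or "suspend").strip() or "suspend",
        })
    return payloads


def push_block(instance, token, payload):
    """POST one domain block to the Mastodon admin API.

    Requires an admin token; `instance` is your server's hostname.
    """
    req = urllib.request.Request(
        f"https://{instance}/api/v1/admin/domain_blocks",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    return urllib.request.urlopen(req)


# Example: parse a two-entry list (hypothetical domains).
blocks = load_blocklist("domain,severity\nbad.example,suspend\nspam.example,silence\n")
```

If you'd rather stay in a UI, I believe recent Mastodon versions also let admins import/export domain blocks as CSV from the web interface under Moderation → Federation, which may cover the same ground without any scripting.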

@mxtiffanyleigh btw, @triketora might know more about tools to facilitate that sort of moderation in federation.

I don't know if Block Party will expand to this platform. Tracy, is that (or a discussion on moderation tools and policies) even something you care to continue here? Cheers!