The thing about whales - quite aside from the dodgy ethics of having one or two people be basically financially responsible for the whole site - is that if you rely on whales, your income's gonna be incredibly inconsistent and you're gonna have a nightmare of a time trying to budget both your business and your life.

Improbable Island's donation setup tries hard to discourage whales and encourage instead small, manageable, regular donations from lots of people.

(there are a limited number of things you can buy, and once you've bought them all, there's no personal benefit to donating more. However, every time someone donates, everyone else gets tangible benefits.)
You'd think this would be an ideal setup for Patreon, but tbqh outside of the top handful of high rollers on that platform I've yet to see any benefit from it. I already took PayPal, and adding Patreon support just meant paying Patreon an additional cut (over and above the cut that PayPal already took!) for the dubious privilege of using their crappy API. The end result was that a handful of players switched to Patreon and now I get slightly less money from them.

No matter how hard you try to set things up so that each month is more or less the same money as last month, you're probably gonna be in Feast-Or-Famine mode. If you're used to self-employment then you're used to this, but it can still be scary, especially if you have a financial emergency like a suddenly-sick pet or car problems combined with a bad month. It can make you think "Aw man, am I ever gonna have a good month again?"

And yeah, you will, but in the moment it's hard to remember that.

Anyway, it's been a stinker of a month so far and I've got a cat whose kidneys are starting to pack in, so anyone who wants to boost https://www.improbableisland.com please do so. :)

Another one for the community moderation thread: DEATH.

We just lost an Islander to COVID-19. I had to knock out an MotD in a hurry to get money and messages of condolence to his widow:
https://www.improbableisland.com/motd.php?id=509

It's very late and I'm getting up very early and I'm emotionally exhausted from writing this, so there'll be more on this subject tomorrow.

For now, think on it, because this will happen to you.


Alright real quick thought on this because it just popped in there and grief does weird things to us:

Remember that bit towards the start of RoboCop where the chief comes in and takes the dude's name off the locker, says when the funeral is, and that donations to the family can be given to Cecil as usual? For a start, you're the chief, and you're Cecil too, so that's two jobs right out the damn gate.

Alright, I'm awake, too damn early, and online death.

On any community website, people will die. When you, the admin, learn of one, it's because they were so heavily into the site that their spouse or whoever reached out to you or someone on the site to say hey, this thing meant a lot to this person. That's vanishingly rare, and for every one of those people there are scores who simply appear to stop logging on, indistinguishable from living people who really did just stop logging on.

Just as we don't have the cultural DNA to be able to handle living on the internet, which is a huge part of our lives despite being brand heckin' new, we're even less prepared for people on the internet dying. The internet is so new, and death online so rarely noticed, that there's no real cultural consensus on how to process it. We have not yet developed rituals and stories and processes to help us cope with online death, so our range of reactions can be even more wildly varying.

Specific to our game, when a moderator died some years ago, we organized a walkabout through her places (player-created buildings) to find that she had left her character in a bed.

Seeing her name on a computer screen sent adrenaline down my spine and I legit felt like I'd seen a ghost. I reacted to seeing her name, in this context, the way I'd react to seeing a ghost, because although I knew her very well, I knew her *as a name.* I saw her *name* in exactly the context that I knew *her.*

We hear about moments like these, like an app posting on someone's behalf about some dumb game on Facebook or whatever, and although the emotions that follow can be anger or sadness or comfort or even becoming part of a ritual of remembrance, the initial emotion in the moment tends to be shock. People literally say "I felt like I'd seen a ghost," because online we're halfway ghosts anyway.

So I remembered this last night, and I warned people that the player's character was still standing in their place. I didn't remove them, because some people would want me to and some would not, and the emotions are conflicting and wildly varying (because, like I say, there's no cultural consensus on how to handle death online and no stories to give us frameworks for how to grieve), but there's never anything wrong with warning folks so they can prepare.

Other things to keep in mind when dealing with the death of an online community member:

Yes, post a public message. You'll have complex admin-only feelings about the member that do not apply to the general userbase; keep those to yourself unless directed otherwise by the player's friends, because this isn't about you. Search "Comfort in, dump out" for info on that.

(our player's MotD, his friends told me, should be to GET PEOPLE THE HELL VACCINATED)

The family may ask you for kind words from the userbase, or set up a funeral fund.

The member's grieving family see your site as a source of happiness for the deceased member, and want to include it in the mourning. The grieving family is not thinking, right now, of the nazi troll you banned last week.

The funeral home will set up a memorial page with all sorts of personal info. Don't link to it.

Tell your members to send kind words and money to *you,* and you'll pass them along.

DON'T DOXX THE WIDOW.

Everyone but the very young has known death. Traumatizing and awful as it can often be, it's a common experience for which we have a script to follow. *Online* death, or rather *noticed* online death, is much newer, people don't know how to cope, and there'll be kind of a mess. Be extra gentle with your users for the next little while.

Also be extra gentle with yourself, and try to resist the temptation to view the deceased as the most visible tip of an iceberg of unknown ghosts.

Online Community Moderation Thread Part NaN: backseat moderation

You've gotta have a thing in your CoC that prohibits talking on behalf of mods, IE people going around saying "Careful, the mods don't like that."

When people say that, they usually mean "I don't like it but I'm gonna shift the fallout of this social sanction away from Cool Free Speech Guy Who Doesn't Care here, and onto Evil Site Staff Who Are Literally Hitler."

Never let anyone speak for you or your mods.

Adding this on to the community moderation thread:

Someone at Stern Pinball got talked into making a pinball discussion platform to go with their new high-scores app. The forum isn't live yet but it's 100% going to be an absolute disaster and I Can Not Wait.

Pinball specifically suffers from several of the dynamics I've talked about in this thread, and having a manufacturer start their own forum about their own games is also a uniquely awful idea, so get your drama-slurping straws ready.

Pinball is a Rich White Old hobby. Ten years ago it was possible to buy pinball machines for fairly cheap, but now the used pinball machine market is full of predatory parasitic investors - that is, blokes with too much money who buy any machine under three grand sight unseen with no intention of playing it, just sticking it into storage and waiting for the price to go yet higher. Obviously there is Drama about this, and about how big the bubble can swell before it pops.

Pinball is currently going through a slow but steady revival, which means new people are coming into the hobby.

Some of them are young.

Some of them are *women* for god's sake.

I've even heard that one or two of them might be... "you know..."

So pinball has been undergoing the sort of drama that happens when a hobby has gone a long time without any new blood, and the old guard are stubborn - and even the ones who aren't resistant to change are slow to adapt.

The biggest pinball forum is Pinside, and it's a horrible, horrible place. It's less horrible than the newsgroup that came before it, but it's still horrible. There's also Tilt Forums, which can be horrible sometimes but for pinball it's comparatively unhorrible.

The pinball community has been so poorly served that they're primed to expect online pinball discussions *in general* to be some flavour of horrible. Standards and expectations are *very* low going in.

So pinball is a rich old white dude hobby struggling with gatekeeping and racism and sexism with no real way out of the cycle (save for "route your beaters," any pinball people reading this - not every game has to be restored to mint condition, put your sheds out at a quarter or 50c a play and let the working classes have a go for a change), and into this whole Situation comes Stern Pinball, the epitome of Rich Old White Dude companies.

Stern in particular is well-known for erasing threads on their Facebook page that veer into territory even slightly critical (such as asking for code updates to abandoned games). Combine this with the sense of entitlement you often see in rich people, and you've got a recipe for a 1:1 ratio of Pinball/First Amendment threads.

This forum will be a case study unto itself!

Earlier on I talked about how targeted advertising is a scam from flim-flam men, and how your best bet is completely untargeted, really wide-reach advertising to find the people who don't yet know (or act like) they want your thing.

This article says that companies who turned off targeted ads don't notice any difference, and nearly all of the people who were targeted would have bought the thing anyway:
https://sparktoro.com/blog/what-if-performance-advertising-is-just-an-analytics-scam/


I saw someone had posted this link on Fedi earlier but I forget who it was. I remember coming across this site five years or so ago and absolutely inhaling it. In the time since I last had a look it's been updated:

http://www.issendai.com/psychology/estrangement/index.html

It's a study of forums for estranged parents - that is, forums full of people so horrible that their own kids disown them.


I'm posting this to the community moderation thread because the behaviour patterns of these sorts of estranged parents mesh so perfectly with the sorts of abusive behaviours that you have to warn your community members about.

This series of articles is ostensibly about parental abuse, but much of the content applies equally well to most emotional abusers.

Particularly useful for online community moderation is this list of dysfunctional beliefs at http://www.issendai.com/psychology/estrangement/dysfunctional-beliefs.html - here's a sampling:

* If one understands something, then one agrees with it. If I don’t agree with something, then I don’t understand it. If you don’t agree with me, then you don’t understand me, and can’t claim that you understand me until you agree with me.

* Emotions cause actions. When I feel something, I can’t not act on it.


Heck I kinda wanna make a list of dysfunctional beliefs that people have about participating in online communities. :)

Dysfunctional ideas about interacting in online spaces, ban if you see evidence of folks thinking this way:

* This website won't survive if I leave it.

* This website's norms don't work for me, so I will try to change those norms rather than fitting in or going to a different website.

* If a website's rule is insufficiently specific, then the important thing is to break it, or nearly break it, so that the admin will make it more specific.

* This website owes me for the time I spend on it.

* Power corrupts, so the moderators of a hobby website should be treated with the same disdain or distrust one would treat a millionaire politician or CEO.

* Moderators only become moderators because they want power over others.

Oof, sad news about Something Awful founder Lowtax.

Something Awful has been going through a messy and painful transformation, a kind of reckoning with its past self, and at some point I was gonna do a case study for this thread.

I've only just heard about Lowtax's suicide so it's probably not a great time to start that analysis, but in the moment this feels like a cautionary tale about deleting your old stuff so you can change.

Community Moderation Thread continued, a case study:

@[email protected] shows us the eventual end state of the hobby degradation dynamic I talked about earlier in this thread.

https://social.bau-ha.us/@aurora/107434889581265192

This starts with admins allowing forums to shift away from normal, everyday conversation about a hobby, and towards threads where people post pictures of the thing they bought today. Further in Aurora's thread are some counterexamples of still-viable groups.

ava vs. the universe ✨ (@[email protected])

trying to sell my old hifi setup and realizing that the stuff i bought for 1-10€ a piece on ebay 15 years ago is now worth over 500€ :blobfoxeyes:

[email protected]

All hobby communities are vulnerable to consumerist takeover, and the effects can spill out into the real world, as we've seen here: inflating prices, pricing out all new members but the very rich, and cementing a self-reinforcing mechanism.

There is no saving a hobby community that has entered this downward spiral. Once a hobby becomes involved with financial speculation, it's a rich-getting-richer wasteland until the bubble bursts, which can take years.

Stopping a hobby from being taken over by the empty content of the rich is easy, but requires vigilance and community buy-in.

Establish in your CoC that posts amounting to no more than "Look at this thing I bought today" are spam, and will be treated as such. Talk about the hobby degradation phenomenon in your CoC so that people understand why it's a necessary rule; your members will help with enforcement if they're familiar with the alternative.

Moar online community moderation thread!

A browse through reddit's "hobbydrama" forum often yields cautionary tales that can illustrate What Not To Do, and here's a good write-up of Neopets' infiltration by NFT scammers:

https://old.reddit.com/r/HobbyDrama/comments/pzmcy2/pet_site_game_neopets_introduces_nfts_burns/

The bit that caught my eye, the bit that made me think this belongs in the moderation thread rather than web3isgoinggreat or wherever, is the language the scammers use.


In this writeup we see Neopets invaded by NFTrolls who have spent so much time sniffing their own farts that it doesn't occur to them to code-switch; they chuck around 4chan words like "oldf**" thinking this is just how people talk.

Unmoderated, anonymous websites (here I say "anonymous" to mean places where you don't have strong visual differentiation between users) ruin your brain.

(NFTs destroy communities too, but y'all already knew that)

In places where the users are difficult to tell apart, and especially in places that attach numeric scores to socialization, people end up talking the same way. Heck, go browse that subreddit I just linked to, ordered by top; the posts all have the same rhythm, same style, same slang, even though they were written by allegedly different people.

Imagine a party full of people who talk alike, the way close friends do, but who aren't friends and don't actually know each other. WEIRD AND SCARY.

This is of course deliberate!

Everyone here knows that spyware companies invest billions into improving their programs to better spy on people and try to predict what people are gonna want to buy. Most people here know that targeted advertising doesn't actually work and it's all just a long con, but the folk who work at spyware companies like google and facebook etc have been - YES! - sniffing their own farts for so long that they're starting to honestly believe their own nonsense!

So when you've chucked billions towards paying some brogrammers to try and predict the behaviour of individual humans and still the best your program can do is show them adverts for a toothbrush they bought last week, if you're particularly sociopathic you might look at the other side of the equation:

Your program might give accurate guesses more often if the people it was spying on were easier to predict.

That's where we're at now: spyware companies have, after decades of trying, finally invented a square-shaped hole, and realised that it'd take many further billions to make that hole sufficiently people-shaped to actually work; now they reason it's cheaper to make a bigger hammer.

Hence facebook's reaction emojis; it's WAY easier to have the product choose from five emotional reactions than to try and parse emotion from a textual comment.

This is why I'll keep circling back to how important it is to allow your users to differentiate themselves visually. Let them upload avatars, change their text colour, choose from different CSS for their profile pages. This fights the homogeneity that spyware companies crave so much, and since so much of people's interactions with computers these days is through spyware, it'll feel to your users like a breath of fresh air.
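As a concrete sketch of the "let users differentiate themselves" idea (hypothetical helper and theme names, not Improbable Island's actual code): validate a user-chosen text colour server-side and let them pick from a fixed set of profile stylesheets, so self-expression can't turn into a CSS-injection hole.

```python
import re

# Hypothetical sketch: server-side validation for user customization options.
# Accepting only a six-digit hex colour and a whitelisted theme name keeps
# "let users express themselves" from becoming an injection vector.

HEX_COLOUR = re.compile(r"^#[0-9a-fA-F]{6}$")
PROFILE_THEMES = {"classic", "midnight", "typewriter", "neon"}  # preset CSS files

def validate_customization(colour: str, theme: str) -> dict:
    """Return sanitized settings, falling back to defaults on bad input."""
    settings = {"colour": "#c0c0c0", "theme": "classic"}
    if HEX_COLOUR.match(colour):
        settings["colour"] = colour.lower()
    if theme in PROFILE_THEMES:
        settings["theme"] = theme
    return settings
```

The design choice here is "choose from presets, free-form only where it's trivially validatable" - users still feel distinct from each other, and you never have to parse arbitrary CSS.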

https://rixx.de/blog/on-running-a-mastodon-instance/

Adding on to my massive long online-community-management thread: Here's a great post from the admin of chaos.social on his experiences running a Mastodon instance along with @leah. There's overlap with running any kind of online community, but federated stuff has its own specific quirks that @rixx highlights nicely in this blog post. A worthwhile read if you're thinking of setting up a Mastodon server or any online space.

Another addition to the online community moderation thread, in which @eldang, a retired Fediverse mod, tells their story:
https://weirder.earth/@eldang/108211095983989977

Meta bit in this thread: Elon Musk just bought twitter, so we may be about to witness what happens when a formerly-badly-moderated site deliberately turns off moderation.

We've seen this before loads of times, and it's predictable - the site fills up with toxic people who scare off first the normies and then each other and it collapses in on itself within months. But I don't believe we've ever seen it happen with a website as big as twitter. This is gonna be fascinating/horrifying to watch.

Here on Fedi we're also gonna get ourselves a big ol' dose of No Fountain, but I see @feditips and others REALLY GOING HARD on the "Write down and broadcast the unwritten social norms for preservation" thing, and I think fedi reminds people of forums and BBSes enough that they're remembering netiquette and dramabombs and site implosions from their own pasts, and taking measures to get the newbies thoroughly doused in Fedi Culture really quickly.

Fedi is very cool in a lot of ways

Big Long Online Community Moderation Thread time? Yes!

Had this conversation again:

Player: "Dan, can you make it so we can block people on Improbable Island?"
Me: "Why, who'd you wanna block?"
Player: "Oh this one jackass, he's been..." *very detailed description of subtly shady behaviour that would've flown right under the radar if they'd just blocked the jackass*
Me: *bans the jackass before they try it on someone else*

Should your website have a Block button? Still probably yeah

But this actually happens quite a lot - behaviour that wouldn't have been documented if people could just click a button, especially the kind that tends not to stand out until enough people have been hurt for a whisper network to form. Having the "Yeah, it's on the list, but it's a technical nightmare because of this ancient codebase - who d'you wanna block, anyway?" convo opens people up to actually reporting stuff that feels hinky.
Admins, always remember: people generally don't report abuse!
(adding block functionality is legit on the list. It's just ancient bloody rat-wiring code and a very complex game and it's the sort of project that could literally take months - and in the meantime whenever someone brings it up there's an opportunity to catch an under-the-radar abuser that I wouldn't otherwise have had. These dudes are sneaky.)

This article shows such a clear picture of how internet trolls are parasitic:
https://www.reuters.com/investigates/special-report/usa-trump-truth-social/

Trolls can't operate without access to an audience that someone else has built, because they're too unlikeable to build their own. They want to leech off YOUR audience, and understand "free speech" to mean "free web hosting and a free, pre-built audience."


I just saw a twitter screenshot where someone used the slur that rhymes with "maggot," and hoo boy holy shit I knew twitter was bad but this is 101-level stuff they've neglected here.

Even the most amateurish PHPBB forum in 2001 knew that there are slurs that you shouldn't allow to be posted on your website because they can't be used for anything constructive or useful. Seriously twitter this is absolutely rudimentary stuff.
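That 2001-era PHPBB-grade filter is genuinely simple. Here's a minimal sketch (placeholder words standing in for an actual slur list; the substitution table and function names are mine, not any real forum's code):

```python
# Minimal sketch of the old-forum blocklist technique. BLOCKED_WORDS holds
# placeholders, not a real slur list. Normalizing common number/symbol
# substitutions catches the laziest evasion attempts; a real deployment
# still needs human review on top.

SUBSTITUTIONS = str.maketrans("401!3$5", "aoliess")  # e.g. "4"->"a", "0"->"o"
BLOCKED_WORDS = {"badword", "slurhere"}  # placeholders for an actual list

def contains_blocked_word(text: str) -> bool:
    normalized = text.lower().translate(SUBSTITUTIONS)
    # Strip anything that isn't a letter or space, then do a substring check.
    # (Substring matching risks Scunthorpe-style false positives, so flag for
    # review rather than auto-banning.)
    cleaned = "".join(ch for ch in normalized if ch.isalpha() or ch.isspace())
    return any(word in cleaned for word in BLOCKED_WORDS)
```

For example, `contains_blocked_word("you're a b4dw0rd!")` catches the substituted spelling, while ordinary text passes through untouched.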

Quote from a toot posted elsewhere, regarding blocklists:

"How is this list to be regulated? By the number of votes? What if 99% of the submissions agree to ban a certain religion, or vegans, or economists who wear yellow shirts on Tuesday?"

This is a type of user you should ban straight away without engaging. Every single time I've seen this sort of post, it's from someone who gets banned from places a lot for being exhausting.

You'll come across people who are leery of moderation in general; some of them have genuinely had the experience of being in a community that imploded because of bad mods (always on someone else's website: facebook groups, subreddits, anywhere you can start a community with a couple of clicks), but 95%+ of the time it's because they, personally, keep getting banned from places.

Nice little post here to add to this thread:
https://thagomizer.com/blog/2017/09/29/we-don-t-do-that-here.html

The "here" part of "we don't do that here" has special Culture Juice in it. You're not trying to change the whole world, that's hard; you're trying to make a nice little space in a website, and people can fit into the culture or they can not and go somewhere else instead. You're just trying to make a nice little bit of positive culture. Yoghurt pot sized like.

Remember earlier in this thread I talked about dramaslurping? Get your straws out and mosquito your way over to this deliciously festering puddle of How The Hell Did We Let This Get So Bad
https://cohost.org/staff/post/124903-community-guidelines

The gist: people were posting drawn child porn on cohost, and rather than banning those people immediately, cohost wrote a very long post about "working to implement a system to allow us to get user input on this area of policy"

Community Guidelines Update

hey folks, we’ve got a couple big trust and safety updates coming today, including some changes to the community guidelines [https://cohost.org/rc/content/community-guidelines]. we wanted to go over everything here for transparency about what we’re doing and why.

first off, the community guidelines. we’ve gotten a lot of questions and reports on content that, while we considered to be borderline but permitted, was absolutely in a gray area in the written community guidelines. we had internally developed a set of policies that we were applying to the small number of cases that came up, but had not publicly announced the policies we were applying because of some open questions we still had. this was a bad call, and moving forward we’re going to be more transparent about areas of uncertainty and indecision in our policy. here’s a summary of the changes:

* we’ve added clarifications to the section regarding child sexual exploitation material, and how it pertains to non-realistic depictions of minors, in an attempt to provide clarity and consistency for enforcement.

* internally, we had been drawing the line at the prevailing legal definition of “realistic depictions,” which includes photographs/videos of actual human minors, or content difficult to distinguish from actual photographs/videos.

* policy around non-realistic depictions, such as lolicon/shotacon, has not yet been finalized. we don’t want to implement a policy that the majority of users would feel uncomfortable with. we are currently working to implement a system to allow us to get user input on this area of policy. until such time, please refrain from posting it; up to this point, we have been asking people posting it to remove it pending a final policy decision.

* we’ve added a new section clarifying and adding new rules around content warnings.

* previously, content warnings were only strongly recommended for posts containing potentially sensitive content. in most cases, this is still true. however, we are now requiring CWs for certain types of content.

* this policy change is accompanied by a technical change that prevents these CWs from showing up in unrelated tag pages. these posts will still show up on your dashboard (if you are following the poster), profile pages, and tag searches for any of the terms on the list.

* the full list of mandatory content warnings can be found on our support site [https://help.antisoftware.club/support/solutions/articles/62000226150-mandatory-content-warnings]. this page is also linked from the community guidelines.

* repeated failure to add mandatory content warnings, as well as attempts to circumvent the filtering system (such as by using numbers or symbols in place of letters), are considered bannable offenses. we don’t want to ban you so please be normal about this.

the tag page change is live now. our motivation in this change is not to censor any types of allowed content, but to prevent certain types of sensitive content from showing up in large, more general tags. while we may make changes to this list in the future, all changes will come with a notice, as well as a grace period for users to start adding CWs to their posts. our goal is to provide a robust set of tools that allow everyone to customize their own experience to their level of comfort and safety. to support this, we are actively working on a system with which you will be able to completely hide posts that include CWs you never want to see and skip the clickthrough on CWs you do not need a warning for. these tools are being worked on in addition to general tag filtering tools. above all, we believe that you know your own preferences, limits, and triggers better than anyone else; our intent with these changes is to help you see the posts you want to see and none of what you don’t.

we also want to clarify that, thus far, we have not received any reports for content that, under the new rules, would require a mandatory content warning but did not already have one. we really appreciate that people are using the content warning system correctly, even before we had rules in place. the purpose of these rules isn’t to change anyone’s behavior, but to codify behavior we already saw, as well as to make our job moderating easier. we are, as always, open to feedback on these policy and technical changes. this is a tricky, sensitive area to work in, and we’re making sure to act deliberately and with consideration. this is not a sudden decision; we have been thinking over these changes for well over a month now. (related: having weekly hours long conversations with your coworkers about lolicon kind of sucks and we would recommend against being in a position where that’s necessary.) that’s all for now. please let us know if you have any questions or feedback and, as always, thanks for using cohost!


They then solicited comments about whether to allow drawn child porn - and the sort of abusers* who draw child porn - to be on their website (which of course would then turn this website into The Loli Website until it got disabled by its host/registrar), or whether that'd be 😰CENSORSHIP😰

The thread is predictably full of Very Too Much Online people

(* over nearly 15 years and a quarter of a million players, every single person on my game who ever even talked about loli turned out to be a serial abuser)

Look, there's a lot of nuance in online community management. There are very few easy black-or-white decisions. Most of the decisions you make over what to delete/ban will be difficult, agonizing even, and will end up with you getting yelled at.

This isn't one of those. This is forehead-slappingly obvious, and cohost managed somehow to not just dither over whether to be The Nonce Site, but do it *publicly.* That should be a big red flag for anyone thinking of being associated with them.

Comments on cohost, and a comment a new fediverse admin in their early 20s left on this thread, reminded me that there's a very specific failure state some Very Online people get into when they're admining a website, and that is to mix up government with website operation.

Websites aren't countries, they're Literally Just Websites. Your users can leave and go to one of the over 300 other websites on the internet, whenever they like.

(if they can't, your website shouldn't exist)

If you're seriously feeling the need to run a website to the same standard as an imagined Ideal Country, then you're gonna have a Really Bad Time, because the two have absolutely nothing in common.

It's not necessary to have convoluted discussions about censorship when the people who want to look at the thing you're about to ban can still look at it by typing in a different web address.

You're not a government. You just run one website, out of many.

(actually I just checked and there are over FOUR hundred websites now)
Like, go read that cohost thread. That's an awful lot of people who are discussing the question of "What is and is not morally right" instead of "What would be good to put on this website vs what should go on a different website somewhere else," and this is an EXTREMELY common mixup!
@ifixcoinops need to reread this thread again soon because the sheer amount of hard-won admin experience in it is dense. Thanks for being one of the few people who have done it for a long time without becoming jaded, burnt out & disengaged.
@ifixcoinops cohost is a fascinating case study because I saw them start and make great claims about being well-moderated and centralised but all with that not-yet-unwrapped sheen of “nobody would really be bad here” idealism. I have to go back over the events there and maybe do a write up.
@ifixcoinops and yes, I fully admit the glass house I’m throwing stones from. But in my case, I very intentionally have avoided scale, because I want to build my experience at a rate that’s faster than building my responsibility (userbase). Scaling fast is the enemy.

@s0 Ah, idealism. I do miss those days. :P

Like, one of the staff made a post about how they really wanted to ban a guy for being an arsehole, but being an arsehole wasn't against the rules and he wasn't specifically breaking any. Like, yeah my guy, 90% of the people you have to ban will be like that. The ones you ban for a clear and unambiguous rules violation will be a tiny minority!

Programmers making rules figure it's all if-this-then-that simple
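To riff on that mindset: here's a toy rule engine of the sort that if-this-then-that thinking imagines moderation to be. Every name, phrase, and post in this sketch is invented for illustration.

```python
# A toy "if-this-then-that" moderation engine. All names and
# example posts here are made up to illustrate the point above.
BANNED_PHRASES = ["spam link", "buy followers"]

def rule_check(post: str) -> bool:
    """Return True only if the post matches an explicit written rule."""
    return any(phrase in post.lower() for phrase in BANNED_PHRASES)

# The easy case: a clear, unambiguous violation. The engine catches it.
print(rule_check("Buy followers here, great prices!"))  # True

# The common case: someone being relentlessly exhausting without
# ever matching a written rule. The engine shrugs.
print(rule_check("Just asking questions about why the mods play favourites..."))  # False
```

The second post is the 90% case from above: no string matched, but an experienced moderator recognises the pattern instantly. That judgment doesn't reduce to rule lookups.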

@ifixcoinops
> "How is this list to be regulated? By the number of votes? What if 99% of the submissions agree to ban a certain religion, or vegans, or economists who wear yellow shirts on Tuesday?"

I'm the type of person who asks “how is this list to be regulated?”, and you can check my MRF reject list to see that I'm hardly scared of banning instances/people.

https://pleroma.cafkafk.com/about

I also totally agree with moderating “subtle” examples of bad behavior or abuse.

I do feel, however, that you are creating a strawman of people who demand some sort of due process, which is a bit unfair.

To me, it's still paramount that there is SOME evidence. It can be just a screenshot of the conversation and an explanation that makes it obvious.

But I think that there MUST be some evidence, unless the ban is by vote, in which case I would think it fair that a certain instance wanted to self-regulate, regardless of whether I personally agree that the regulation was valid.

And by vote, I don't mean by poll; I mean that the majority, or even 66%, of the instance's active users vote yes, not just the majority of people who bother to vote.
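A quick sketch of the difference between "majority of voters" and "majority of active users" (all the numbers here are invented for illustration):

```python
# Invented figures: an instance with 1000 active users,
# of whom only 80 bother to vote on a ban.
active_users = 1000
yes_votes = 60
total_votes = 80

# Majority of people who bothered to vote: passes easily (60 of 80).
passes_by_turnout = yes_votes > total_votes / 2

# Majority of ALL active users, as proposed above: fails (60 of 1000).
passes_by_active_users = yes_votes > active_users / 2

print(passes_by_turnout, passes_by_active_users)  # True False
```

Under the 66% threshold mentioned above, the bar would be `yes_votes > active_users * 0.66`, i.e. at least 661 yes votes out of 1000, which is a much higher standard than winning a poll among whoever shows up.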

> This is a type of user you should ban straight away without engaging. Every single time I've seen this sort of post, it's from someone who gets banned from places a lot for being exhausting.

Am I really this bad?

Because to me, it seems like a lot of people throw the baby out with the bathwater when it comes to these things. We don't have to forgo ideas of “free speech” or say that principles of due process have to be thrown out because otherwise we can't moderate people. Because we can.

And people who choose to believe that free speech includes the right to throw slurs at random people on the internet have fallen for a conservative political psyop; an instance is not against free speech just because it does not allow harassment, slurs, and other toxic behavior.

The problem in those cases is not the speech, but the intent. The crime is not saying the slur, but the hurt that the slur causes.

Idk, either way I feel like most of what you are saying here is correct, but painting people who ask questions about the process as evil is not something I think is okay. If you're not willing to answer questions about the moderation process, then you're likely doing something you shouldn't be, and you're scared to tell other people, imo.

But I'm also one of those crazy people that believe in transparent governance.

@ifixcoinops i strongly disagree! a lot of words, including that one, are reclaimed by the communities they originally targeted. queer slurs are very commonly used in those circles now, i should know, i am one!

it feels like it should be easy to have some 'ban list' of words that aren't allowed for discussion. if it was a forum in 2001 it would probably be for people who liked a certain brand of hammers, and perhaps the nitty-gritty details of queer identities isn't something it focuses on enough to worry about allowing those terms there. but when your context is general-purpose for a billion people you can't so easily figure out where people are coming from.

i don't think [good] moderation of a site like twitter is actually possible, and maybe that's the biggest hidden issue with non-federated social media. for now, what we have here is the best: instance-level blocks and strong user tools to deal with harassment.

@ifixcoinops

That is such a good insight, it makes a lot of sense.

Blocking and isolating trolls sounds like the best way to fight them.

@ifixcoinops People new to administrating do not understand how important it is to have informal mechanisms to report issues. People are afraid to cause trouble, but they'll gossip to no end once you ask them to.

@ifixcoinops I’ve learned two things about community management the hard way:

First, the blast radius of toxic community members is huge and mostly invisible; for every person who speaks up, there are dozens who've quietly just left and hundreds who peeked in, saw what you tolerate, and closed that window without a word.

Second: the worst thing your community ignores is what it accepts, and the worst thing your community accepts is what it becomes.

@mhoye @ifixcoinops and one thing I've learned as an admin: just kick the offenders. No trial. No scrutinizing the rules trying to figure out if they've broken them or "just bent them" or come very close. If they ask you why, you can just say that you don't like how their presence and behaviour affects the community. Or just that you don't like them.

It doesn’t matter if they think you’re unfair, biased, or a bully. It’s your community, not theirs.

@ifixcoinops @feditips There's a LOT here that reminds me of the loosely federated systems from the Before Times. Before HTTP, even Before Public Internet.
@ifixcoinops There will be a live streamed mass shooting 8chan style on it within 90 days, bank on it.
@ifixcoinops This is me watching it collapse
@ifixcoinops I honestly don't think Twitter will actually do away with moderation, or if it does it won't be for a long long while. Like, since when has Elon Musk shipped a product remotely on time or in line with how it was promised to be? ;)

(I'm only half-kidding!)
@ifixcoinops @leah @rixx I just need to say thank you for going out of your way to be excellent to the world. I may not technically be part of chaos.social, but I feel a bond to this community and I'm impressed by how complex administering an instance can be.
Trustworthy admins are one fundamental aspect that make the #fediverse special.
@ifixcoinops to me it feels like eye strain when users inevitably choose to post in 4px Comic Sans dark grey on black with fixed 1800px wide paragraphs
@danielcassidy Aye that's why you give them a limited range of things to choose from, which you've tested, and give folks the option to override all that shit :P