I want to echo what other Mastodon admins are saying: please report any harassment or violative messages you're seeing. Unlike the birdsite, your local admin will likely react quickly -- and you'll make the instance better for everyone, not just yourself.

So... please don't just ignore! Let us know if there's a problem.

@cgseife how do people report…..?
@scullingmonkey At the bottom of the message, do you see those three dots? Clicking them should bring up a dropdown that gives you the option to report.
@scullingmonkey @cgseife but what if I want to report some bots from Twitter that came here and are using a certain hashtag? (far-right in Mexico) They are trending something to create chaos in Mexico
@cgseife I think that this is overestimating the human, financial and mental capacity that hobbyist admins will have to moderate as instances scale. As bad as Twitter does it, they have (had) a lot of resources to throw at moderation.
@walla Perhaps... you're right about funding. But I already see an advantage to federation... if your admin doesn't manage well, you can move to another instance that hasn't deteriorated. Who knows how the scaling laws will apply, but I'm hopeful.
@walla Very true. If you're old enough to remember usenet, it has a similar feel.
@cgseife like maybe two years too young for that, or just not hip enough. IRC had a similar feel back in the day.

@cgseife I noticed the problem mentioned in the link (lolicon) on Mastodon in 2017, not so much now. The reason that Facebook and other platforms aren't flooded with child sexual abuse material and gore is because of the scores of minimum-wage workers they hire to moderate, mainly in the developing world. We will see how it all works out...
[Reuters link preview] "Mastodon: What is the social network hailed as a Twitter alternative?" -- With Twitter in disarray since the world's richest person took control of it last week, Mastodon, a decentralised, open alternative from privacy-obsessed Germany, has seen a flood of new users.
@cgseife @walla Right - I think smaller instances will be able to keep up. Maybe less so if people are diving into the federated timeline and reporting everything they find there, but I know that after having built up a set of follows I spend most of my time on home and local. Both of which I suspect don't scale too much with the size of the network as a whole. (There's only so many follows a person can have and stay sane!)
@walla @cgseife People here are much more likely to self moderate because the instances are smaller. There will of course be some instances with nazis and other dark stuff on them but they will be very isolated and hard to find from well respected servers. Reputation plays a much bigger role here than on the birdsite or on other centralised platforms
@Novice_Idiot @cgseife I disagree that the size of a platform makes people more likely to self-moderate (e.g. Kiwi Farms). Also, if the platform requires an understanding of reputation, it also requires media literacy and extensive experience of the internet; this is not necessarily a safe space for children or for new internet users / people who don't get much access to the internet (e.g. the developing world).
@walla @cgseife True, in general this platform isn't safe for children. It would be quite easy to create safe spaces for children, however; they could, for example, be moderated and operated by parents. On other platforms like Facebook and Instagram everyone has access to the whole platform; here one can restrict what one sees much more easily.
@walla @cgseife Well, that's certainly something to consider. They're surely rushed off their feet right now with the influx. But moderators only have to moderate their own instance, not the whole of Mastodon. I suspect that the moderator-to-user ratio is a *lot* higher than on birbsite.

@walla
> [Twitter] have (had) a lot of resources to throw at moderation.

I think you overestimate the human, financial and mental capacity that a single organization can throw at moderation, while also doing almost all their software dev in-house, and sysadmin for a single instance of hundreds of millions.

The fediverse has at least one human mod per instance (the admin), and usually more as their user numbers grow and users want to help safeguard their own experience.
(1/2)

@cgseife
@jfred

@walla So the number of moderators scales naturally with the growth of users. Then there's the way users can mute/block entire instances, and admins can silence/defederate them for all their users, instead of always having to play whack-a-mole with individual trolls. Combine all this and moderation is actually easier in the 'verse. Check out this fantastic video about all this by @derek :

https://conf.tube/videos/watch/d8c8ed69-79f0-4987-bafe-84c01f38f966

(2/2)

@cgseife @jfred

[PeerTube link preview] "Decentralized Social Networks vs. The Trolls" (conf.tube)
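As an aside for the technically inclined: the per-user instance mute/block described above is also exposed through Mastodon's REST API as `POST /api/v1/domain_blocks`, which takes a single `domain` form field. A minimal sketch of building that request — the instance and domain names are placeholders, and actually sending it would require an OAuth access token:

```python
# Sketch: blocking an entire instance for your own account via
# Mastodon's user-level domain-block endpoint.
# POST /api/v1/domain_blocks takes one form field: `domain`.
from urllib.parse import urlencode

def domain_block_request(instance: str, domain: str):
    """Build the URL and form body for a domain-block request.

    Both arguments here are placeholders for illustration; a real
    request also needs an Authorization: Bearer <token> header.
    """
    url = f"https://{instance}/api/v1/domain_blocks"
    body = urlencode({"domain": domain})
    return url, body

url, body = domain_block_request("example.social", "edgelord.example")
```

Undoing the block is the same endpoint called with the HTTP DELETE method.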

@cgseife

I'm interested in how people are handling cross site moderation.
Do moderators reach out to home moderators to say "hey this guy is being a shitlord, will you do anything?"
if i or someone on my instance is being a shitlord, i wish to be told

@xorowl Good question. So far, I've just blocked, with one exception: when, in snooping about whether it was just a user or a whole site that needed to be blocked, I found that another instance was going to be targeted for harassment... and I notified that instance's admin privately.

I haven't posted my blocklists publicly just because I can see how they'd be used for targeting, but would be willing to share with other admins who ask.

@xorowl And, so far, I have done the nominal report that goes back to home moderators before blocking... though in all cases I've seen so far, they come from edgelord instances so reports are likely laughed at.
@xorowl And this is probably more what you're interested in, but -- knock wood -- I haven't had any complaints about people on my instance yet. But yes, I would definitely want to know, and would act upon any reports if appropriate.
@cgseife @xorowl I think it's pretty easy to weed out bad actors when your site only has a couple dozen users. But if you let bad actors run amok on your instance, you may get defederated by other admins. Most malicious instances are that way intentionally, so blocking an entire instance is usually the first thing I do when I see obviously abusive behavior. But really, most people are considerate, so it is rarely a problem.
@xorowl @cgseife on some of the apps, at least, there is an option when reporting to forward the report on to the admin of the other instance. Won't help if the admin doesn't care, of course, but then that approaches defederate-from-that-instance territory
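The forward-to-the-other-admin option mentioned here corresponds to the `forward` flag on Mastodon's report endpoint (`POST /api/v1/reports`). A hedged sketch of assembling that request body — the IDs and comment below are made up for illustration:

```python
# Sketch: the JSON body for POST /api/v1/reports.
# `forward: true` asks your server to relay a copy of the report to
# the moderators of the reported account's home instance.

def build_report(account_id, status_ids, comment, forward=True):
    """Assemble the report payload. All values here are illustrative."""
    return {
        "account_id": account_id,   # who is being reported
        "status_ids": status_ids,   # the offending posts
        "comment": comment,         # context for the moderators
        "forward": forward,         # relay to the remote instance?
    }

payload = build_report("12345", ["67890"], "Spamming a harassment hashtag")
```

The payload would then be POSTed to `https://<your-instance>/api/v1/reports` with an OAuth bearer token; the report dialogs in the web UI and apps are doing essentially this under the hood.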
@cgseife people have been *so* trained by big social media that reporting users or posts will do absolutely nothing, it's so sad
@cgseife
Newbie question: I don’t need this at the moment, but how do I reach an admin to report?
@mm You can either tag directly in a toot, or, if it's about a specific message, you can click the three dots at the bottom... that allows you to report it to the admin.
@cgseife thanks! Obvious really, but my brain is in bird mode still.
@cgseife I reported someone and he was quickly removed. Impressive!!
@cgseife how do we report them here?
@mmylova Open the bad post; click on three dots; drop down menu should give you the ability to report.
@mmylova the ellipsis at the bottom of the post opens a drop-down menu where you can report

@cgseife

Is there moderation for misinformation? I’ve yet to see any anti-fact toots, but there is certainly going to be a huge influx of election misinformation in 36 hours. I’m used to confronting a rising tide of anti-vax & covid nonsense on the Bird site. If it does spill over on here, will it be dealt with? (I noticed this was a speciality of yours)

@The_Ouroborus It's gonna be tough, TBH. As an admin myself, I'm not sure how I'll be able to vet things... I'll keep an eye on trends and links, but there's too much volume to do much more than that.

On the other hand, the ethos of the community seems such that I'm guessing it's less of a problem -- and I suspect that misinformation is less viral here.

So... short answer: probably not very effectively, but I think the problem is currently less here. In 2 years, things may be different...

@The_Ouroborus ... but I'm betting that if Mastodon really catches on, the structure of the federation will be pretty different, too.

(I'd bet on corporate media instances with their reporters on board becoming dominant.)

Could be very wrong, though. :)

@cgseife From my limited experience of it, there seems to be room for adaptation and growth, and in comparison to Twitter it’s very peaceful. At the moment. From the chatter I’ve seen over the past couple of days, people are really wanting to move on. This week could be pivotal.
@cgseife I’m used to the confrontation which is hard wired into twitter. I think unfortunately the disinformation is going to grow on the other site and it’s something which concerns me greatly. Like many, I think I will have to adjust to speaking to people who respect the science or political systems again. Relearn the art of civil conversation! It will be really interesting to see how this site adjusts to swathes of people, utterly clueless about what we are doing.
@The_Ouroborus I feel exactly the same way. I'm going to be struggling as an admin about when a disagreement becomes a confrontation worth stepping in for... I'm planning on using a cocktail-party model. I hope it's a good line... and that it holds!
@cgseife In the main I think people can self moderate (though while arguing about whether Covid exists, vaccines kill or whether people have died I’ve been called so many names that moderation becomes more simple). Maybe this can grow as a more mature microblogging site which respects expertise and doesn’t reflexively abuse others.
@cgseife I am delighted to say that the only abuse I've seen in a week, was in screenshots taken in some gamer community instance I would never be found in anyway.
@cgseife I've reported 6 racist and/or homophobic/transphobic accounts on Twatter recently, some were openly calling for people to be shot or die in other ways. All 6 reports rejected as 'not a violation of our rules'.
@pikminlover Yeah. About 50% of the stuff I report -- pretty egregious stuff -- winds up having no action taken. Part of the reason for the tone of the place.
@cgseife new here, sounds good. "Birdsite", good one
@cgseife  This! The report button on mastodon actually works! 
@cgseife @StephenMcGann — might I just qualify that a little though, please? Whilst your local admin will almost certainly want to react quickly, please remember that unlike the birdsite local admins will almost certainly have jobs in the daytime and families in the evening so if they don’t react within five minutes of a report they won’t deserve getting a hard time over it.
@cgseife how does one report stuff to an admin? Haven’t had any yet, but I’m a Girl Scout…
@AvisHG Under a post, there's three dots next to the favorite star. Click on that, and you'll get a drop-down menu that allows you to report.
@cgseife this is an interesting issue vs. Twitter, because unless you’re on the same site, how would you know if a post violates that site’s policy?

@graeme_0 Good question, and a flaw with federation. (As it is with laws, too!)

I think there's at least some agreement on what basic decent public conduct should be -- there's always going to be edge cases, but I think that if you post in good faith with reasonable decorum, you're in good shape on most instances.

@cgseife thanks, I already saw some far-right Mexicans on Mastodon... like, they were the problem on Twitter and now they've come here
@monbrielle Very interesting! I'm not enough of a veteran to know the best strategy, but reporting back to your admin who would be able to help things locally would be a start. The #fediblock thing might be a way to try to deal with it globally.
@cgseife thanks, will do, because there are more than 100 of them repeating the same things they did on Twitter (spamming the same toxic hashtag)
@cgseife
How can we look up the rules/guidelines for different servers?
@TammyGentzel The about page... click "learn more."