Like email? So I use Microsoft Outlook?
"Use WHAT"
(Tools -> Account Settings -> RSS Feeds -> New -> https://mastodon.social/@Gargron.rss)
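Under the hood, those Outlook steps just point an RSS reader at the account's feed URL. A minimal sketch of what any RSS client does with a Mastodon feed like https://mastodon.social/@Gargron.rss, using Python's standard library; the embedded XML is a hypothetical stand-in for a real network fetch:

```python
import xml.etree.ElementTree as ET

# Hypothetical sample of an RSS 2.0 document, standing in for the
# response you'd get from https://mastodon.social/@Gargron.rss.
SAMPLE_RSS = """<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <title>Eugen Rochko</title>
    <item><title>First post</title><link>https://mastodon.social/@Gargron/1</link></item>
    <item><title>Second post</title><link>https://mastodon.social/@Gargron/2</link></item>
  </channel>
</rss>"""

def feed_items(rss_text):
    """Return (title, link) pairs for each <item> in an RSS 2.0 document."""
    root = ET.fromstring(rss_text)
    return [(item.findtext("title"), item.findtext("link"))
            for item in root.iter("item")]

items = feed_items(SAMPLE_RSS)
```

Outlook, Thunderbird, or any feed reader performs essentially this parse on a schedule, which is why an email client can double as a read-only Mastodon client.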
Building digital ecosystems at Google, but opinions my own. He/him.
Black Lives Matter
| Blog | https://mekka-tech.com |
| https://twitter.com/mekkaokereke |
My out of office auto responder:
Hey friends, I'm out this week for the holiday. Please do hesitate to reach out.
If you know my mobile #, you may send me Beyoncé gifs. I will try very hard to not look at my email until I'm back after Thanksgiving.
A question people have asked of other social networks over the years: should the limit on how fast you grow be how fast you can expand technical infra? Or should you also consider your ability to safely moderate the communities you serve?
Other questions:
Is the false positive / false negative moderation rate higher for marginalized women? If so, what can we do to mitigate that?
One of the weirdest paradoxes of customer service is that a bad experience, handled well, can *increase* people's perception of your service.
Gargron's transparency about a few things has been great:
* Admitting they made a mistake
* Explaining how the mistake happened in each case
* Being transparent that they are struggling to scale moderation
* Correcting the mistakes
It's easier and faster to scale tech infra than to scale moderation.
No, it was a moderation mistake, based on a report of an impersonator account.
There's a bot instance that mirrors popular Twitter accounts on Mastodon. She tweeted that there was an impostor account on Mastodon that wasn't really her. Someone then reported her real Mastodon account, with a link to her tweet. A mod erroneously deactivated her real account, thinking it was the impostor. Mods reinstated it and apologized.
Moderation is hard to ramp up quickly.
@nebuchi I'm a hopeless optimist!
I believe that Gargron doesn't want it to be true that Mastodon doesn't work as well for Black users.
I believe that his ears are more open to feedback now than they may have been in the past.
I believe routing users to smaller instances with better moderation in the short term, and scaling effective moderation on the main instances in the medium term, is preferable to defederation.
Ultimately I want most large instances to be safe for Black users
Exactly. Any social network that has done moderation at scale realizes that this is part of the problem space: malicious users target Black women with a flood of false and fake terms of service violation reports.
This happens on FB, Instagram, YouTube, Twitter, etc.
Inability to control for these false positives and cynical reports creates a network that doesn't work as well for Black women or other targeted groups.
@mortentoudahl No one is saying this was done "based on colour." I don't believe a moderator sat around and consciously said, "I don't like Black women! Therefore I'm going to mess with this account!" That's not what happened.
Let's say false positive moderation events impact 0.1% of all new users, but 10% of Black women users. Is that something that you think we should look into fixing?
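The arithmetic behind that question is a simple relative-risk ratio. A sketch using the post's illustrative numbers (0.1% baseline, 10% for the affected group; these are hypotheticals, not measured data):

```python
def relative_risk(group_rate, baseline_rate):
    """How many times more likely one group is to be hit by a
    false positive moderation event than the baseline user."""
    return group_rate / baseline_rate

baseline = 0.001  # 0.1% of all new users (illustrative)
group = 0.10      # 10% of Black women users (illustrative)
risk = relative_risk(group, baseline)  # roughly 100x
```

A 100x disparity is the kind of gap that won't show up in an aggregate false-positive rate, which is why measuring moderation error rates per group matters.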
As a general rule, Black people are not very worried about "openly racist." Who wears blackface, who says the N-word, who has a Klan hood. White folk are almost overly concerned with openly racist.
Black folk are much more concerned with systemically racist. Who smiles, but doesn't hire any Black people. Who laughs, but calls the cops on Black neighbors. Who says a system is "working as intended" even if it harms a large % of Black folk.