Some quick notes on how we might build some of the essential infrastructure and governance processes that will be needed if #Mastodon is really going to be sustainable and viable as a mass-adoption social network (1/n):
a) We need to scale content moderation. A LOT. Corporate social network sites (SNS) do this by using armies of poorly paid, outsourced contractors. The #Fediverse should do it better. Perhaps by organizing a worker-owned content moderation cooperative?
a2) Smaller instances can self-moderate / use volunteer labor / whatever. But large instances will need to be able to scale, so a content mod coop (or a federated network of multiple such coops) that can be hired/contracted by larger instances would be amazing.
a3) Another option is for larger instances to hire more mods directly. Hopefully, some of the larger instances can themselves be organized as cooperatives. Probably some combination of in-house moderation & contracts w/coops would work well.
b) funding mechanisms. It is going to take money to scale. For hosting, development, ongoing improvements, #a11y, localization, UX improvements, security, and perhaps most of all, to pay content moderators just wages.
b2) currently, most of the money is in the form of small recurring donations to the German nonprofit that runs the largest instance. every instance running its own Patreon is part of the puzzle, but it probably can't be the whole thing.
b3) probably there will be a mix, with large donations from individuals, private foundations, and perhaps increasingly some state actors (for example, municipalities, libraries, state agencies, etc) providing contracts. There will also be some companies that want to donate (and contribute coding time, etc).
b4) All that money flowing in, mostly to the largest instances, ideally should be governed at least in part through participatory budgeting mechanisms. Alternatively (or in addition), there should be formalized governance mechanisms (elections for the board of the Mastodon nonprofit? liquid democracy? sortition? stakeholder board members?) to truly democratize resource allocation.
c) Now that the 'don is taking off, from DIY small community to wider adoption, intentional bad actors are in the mix at scale. We will need to take this seriously, and invest HEAVILY in various approaches to minimizing harm, constantly working to block and limit bad actors, defederate the worst instances, and ... create our wildest dreams in terms of care, follow-up, and support for community members after troll attacks!
c2) we control the fediverse, not the market, the state, or the billionaires, not surveillance capitalism, not ad markets, so why would we limit our dreams of how to create community safety to content moderation alone? Let's dream bigger. We can create (and resource) new tools, implement shared banlists, provide resources for rapid response teams and after-attack processing support, and so much more!
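One of the tools mentioned above, shared banlists, can be sketched concretely. Mastodon instances can already export and import domain blocks as CSV, so a federated safety network might merge several instances' lists, keeping the strictest action per domain. The two-column format and the severity names below are simplifying assumptions for illustration, not the exact export format.

```python
import csv
import io

# Severity ranking: "suspend" is stricter than "silence", which is stricter than "noop".
SEVERITY_RANK = {"noop": 0, "silence": 1, "suspend": 2}

def merge_blocklists(*csv_texts):
    """Merge several shared banlists, keeping the strictest severity per domain.

    Assumes a simple two-column CSV (domain, severity) loosely modeled on
    Mastodon's domain-block export; real exports carry extra columns.
    """
    merged = {}
    for text in csv_texts:
        for domain, severity in csv.reader(io.StringIO(text)):
            current = merged.get(domain, "noop")
            if SEVERITY_RANK[severity] > SEVERITY_RANK[current]:
                merged[domain] = severity
    return merged

# Two instances publish their blocklists; the merge takes the stricter action.
list_a = "spam.example,silence\nabuse.example,suspend\n"
list_b = "spam.example,suspend\n"
merged = merge_blocklists(list_a, list_b)
```

Taking the stricter of the two actions is just one possible merge policy; a real shared-banlist network would also need provenance and appeal mechanisms so one instance's mistake doesn't propagate everywhere.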
(pause for now as I'm heading to a budget meeting, hope to return soon with more).
@schock I’d like to see a third-party moderation service that can be contracted by any social media platform, with tools to review content and take action to enforce community rules. Actions could be reversed through an appeal process. And I believe actions taken by humans could be used to train ML models to scale moderation. I’d also like to be able to directly hire a “bot” that would handle moderation for me; it should also train an ML model. cc @cd24 @seb
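The workflow described here (human reviewers take actions, appeals can reverse them, and upheld decisions become ML training labels) could be sketched roughly like this. All the names (`ModAction`, `AppealQueue`, etc.) are hypothetical, not any real moderation API.

```python
from dataclasses import dataclass, field

@dataclass
class ModAction:
    """A single moderation decision made by a human reviewer."""
    post_id: str
    action: str            # e.g. "remove", "limit", "warn"
    rule_violated: str
    reviewer: str          # human reviewer id; also the source of a training label
    reversed: bool = False

@dataclass
class AppealQueue:
    actions: dict = field(default_factory=dict)

    def take_action(self, act: ModAction):
        self.actions[act.post_id] = act

    def appeal(self, post_id: str, upheld: bool):
        """Resolve an appeal: if the action is not upheld, reverse it."""
        act = self.actions[post_id]
        if not upheld:
            act.reversed = True
        return act

    def training_examples(self):
        # Only non-reversed human decisions become ML training labels,
        # so successful appeals also correct the training data.
        return [(a.post_id, a.action) for a in self.actions.values()
                if not a.reversed]

queue = AppealQueue()
queue.take_action(ModAction("p1", "remove", "no-spam", "mod_alice"))
queue.take_action(ModAction("p2", "warn", "be-kind", "mod_bob"))
queue.appeal("p1", upheld=False)   # appeal succeeds, action reversed
```

Note the feedback loop: routing only upheld decisions into the training set is one way to keep appeal outcomes from being ignored by the model.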
@schock @cd24 @seb I also think governments could run their own official Mastodon services, much like any .gov website is run. Large companies could do the same for accounts that provide customer service and marketing. Let them fund what they use; they can also fund their own moderation.

@brennansv @schock @cd24 @seb yes (need to catch up on the whole thread) - I think hosting an instance and moderation may be separate services, and both are likely candidates for a government contractor (and as an offering for other orgs and businesses).

It will take more than just setting up the servers. Governments will likely need a plan for archiving & record retention (as will any regulated businesses), and for managing shared accounts & access/security.

@brennansv @schock @cd24 @seb and if it wasn’t clear from my reply, I’m seriously looking into what it would take to start such a business (and have ideas for additional features). It may need to start with a different ActivityPub platform (or support a range of them), but I think it will be hugely valuable for businesses, governments, schools, nonprofits, and orgs like unions to run their own social media on the Fedisphere
@Rycaut Do it. I know we need it. Many have had to leave social media because it became such a bad experience for them. Several actresses have gone through terrible experiences. If they had been able to use a service to moderate it, maybe they could still engage with fans in a positive way.
@Rycaut I want all social media platforms to support a moderation API. One example: if I ran a YouTube channel, I could choose a moderation service to moderate the comments according to my policies. Maybe there would be a selection of community rulesets I could choose to apply.
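The selectable-rulesets idea could look something like this minimal sketch: a platform sends each comment's detected labels to the moderation service along with the creator's chosen ruleset, and gets a decision back. The ruleset names, labels, and `moderate` function are all illustrative assumptions, not any real API.

```python
# Hypothetical rulesets a creator could choose from. Each names the
# categories of content that should be removed under that policy.
RULESETS = {
    "strict": {"spam", "harassment", "profanity"},
    "relaxed": {"spam", "harassment"},
}

def moderate(comment_labels, ruleset="strict"):
    """Return "remove" if any detected label is banned under the chosen ruleset.

    `comment_labels` is the set of categories a classifier (or human
    reviewer) detected in the comment, e.g. {"profanity"}.
    """
    banned = RULESETS[ruleset]
    return "remove" if set(comment_labels) & banned else "allow"
```

So a creator picking the "relaxed" ruleset would keep profanity but still drop spam and harassment; the service just needs the platform to expose a hook where these decisions can be applied.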
@brennansv that could be nice but is also immensely complicated (YouTube, for example, has to adhere to rules from many different countries - EU privacy rules, Germany-specific rules, etc.), so I’d imagine it might be a baseline they have to enforce + additional options chosen by the creator. The challenge is also how to do this without abusing the moderators & without encoding various problems via biased AI, etc. Not an easy challenge in the least.
@Rycaut MKBHD made a video about this. There is a script he can run to clean up his comments section. https://youtu.be/1Cw-vODp-8Y