As Mastodon becomes more popular it will face the same content moderation problems that come with scale and increased scrutiny: DMCA takedown notices, CSAM, hate speech, government influence campaigns, GDPR compliance, etc.

The federated model raises interesting questions about who is responsible or liable in each of these cases. The only way to avoid them may be for usage to stay low enough to escape scrutiny.

https://www.wired.com/story/mastodon-legal-issues-tipping-point/

Mastodon Is Hurtling Toward a Tipping Point

As the niche, decentralized social networking platform rises in popularity, it faces rising costs, culture shifts—and potential legal risks.

WIRED
@carnage4life Or be a web hosting company who was founded by industry experts who have done this for 15+ years
@d3cline The idea that web hosting companies have the breadth of experience to deal with social media regulatory issues is optimistic, dare I say, even naive.
@carnage4life @d3cline The way this has historically been handled is exactly at the hosting level, though, in which abuse takedowns for a domain are sent to the host for resolution, and using the hosting model, moderation/regulation would fall to the individual instance. Forums/message boards are proto-social media and have followed this model. e.g. EU DSA requires 45MM+ users; will they consider an entire network the service? Most regulatory action I’m aware of is tailored to a few actors.
@invariant @carnage4life Exactly. It's the same as a forum. I copied and pasted our terms of service because they fit exactly. We already paid for that attorney. ActivityPub only adds the back-end communication to a queue. To me, nothing is new from the other end of the problem. It's all about how to deal with ActivityPub and sell it. This was not a hard problem for our company to solve and move past. As for moderation of other instances, if they are not violating the law it's up to them to self-moderate.