Curious to follow reactions to the recent US cases on social media design.

A helpful & nuanced rundown is here: https://courtneyradsch.substack.com/p/the-algorithm-loses-its-immunity

A large group seems to welcome the approach taken, while some warn of negative effects for smaller platforms and for encryption.

Also interesting to see this in the context of the EU's emphasis on algorithms/design, e.g., on TikTok.

"At this stage, the Commission considers that TikTok needs to change the basic design of its service." https://digital-strategy.ec.europa.eu/en/news/commission-preliminarily-finds-tiktoks-addictive-design-breach-digital-services-act


Here's a great, more detailed rundown of the two groups I mentioned, by @caseynewton:

https://www.platformer.news/social-media-trials-230-content-design/

His conclusion: "Section 230 continues to do a lot of good, and should be handled with care. But to argue that it must be frozen in amber and preserved at all costs is to risk protecting an abstraction at the expense of actual people. Juries have begun to realize that, and one way or another, platforms are going to have to deal with the consequences."
