RE: https://mastodon.social/@tiffanycli/116291084252862983

If there's one piece you read on this make it @mmasnick's excellent breakdown. I wish I could highlight every word, but setting up a system to circumvent Section 230 is not a good thing https://www.techdirt.com/2026/03/26/everyone-cheering-the-social-media-addiction-verdicts-against-meta-should-understand-what-theyre-actually-cheering-for/

@taylorlorenz @mmasnick

What a weird argument. Infinite scroll doesn't work on videos of paint drying? It's the content that's the problem? Well "addictive" content also isn't addictive without the infinite scroll and algorithmic sorting, so the point doesn't really stand. It's the design creating the content that fits the design. I don't see a problem in targeting the design.

@diageo
> If every editorial decision about how to present third-party content is now a “design choice” subject to product liability, Section 230 protects effectively nothing. Every website makes decisions about how to display user content. Every search engine ranks results. Every email provider filters spam. Every forum has a sorting algorithm, even if it’s just “newest first.” All of those are “design choices” that could, theoretically, be blamed for some downstream harm.
@diageo
> The California jury awarded $6 million total — $4.2 million from Meta, $1.8 million from YouTube. For companies that bring in tens of billions in quarterly revenue, that’s effectively nothing. [...]
But that’s exactly the problem. The real cost here is the process. The California trial lasted six weeks. The New Mexico trial lasted nearly seven.
Meta can afford that. Google can afford that. You know who can’t? Basically everyone else who runs a platform where users post things.
@diageo
> Now, any company that makes any “design choice” about how to present user content — which is to say, literally every platform on the internet — is potentially on the hook if any harm comes to any user which some lawyer can claim was because they used that service. The lawsuit becomes a weapon regardless of outcome, because the cost of defending yourself is ruinous for anyone who isn’t a trillion-dollar company.
@diageo
> One of the key pieces of evidence the New Mexico attorney general used against Meta was the company’s 2023 decision to add end-to-end encryption to Facebook Messenger. The argument went like this: predators used Messenger to groom minors and exchange child sexual abuse material. By encrypting those messages, Meta made it harder for law enforcement to access evidence of those crimes. Therefore, the encryption was a design choice that enabled harm.
@diageo
> under the “design liability” theory, implementing encryption becomes evidence of negligence, because a small number of bad actors also use encrypted communications. The logic applies to literally every communication tool ever invented. Predators also use the postal service, telephones, and in-person conversation. The encryption itself harms no one. Like infinite scroll and autoplay, it is inert without the choices of bad actors - choices made by people, not by the platform’s design.
@diageo
> The incentive this creates goes far beyond encryption, and it’s bad. If any product improvement that protects the majority of users can be held against you because a tiny fraction of bad actors exploit it, companies will simply stop making those improvements.

If you actually want better social media, fight for data privacy, tech sovereignty, and interoperability.
Data brokers and the whole industry that works with personal user data are far more harmful and insidious than social media.
We need to slash the incentives for social media companies to grab and sell as much data as they can.

@diageo

@vuuc I agree that sounds like a dangerous precedent. Encryption shouldn't be seen as the enemy.
@vuuc Design choices can cause harm, though. We shouldn't let deceptive design go unchecked just because only the largest companies can afford the legal battles. That's more a symptom of an overlitigious country than an argument against targeting design choices.
@taylorlorenz @mmasnick Isn't he part of the board of Bluesky? How can I expect he won't have a different opinion than that?

@taylorlorenz
> But under the “design liability” theory, implementing encryption becomes evidence of negligence, because a small number of bad actors also use encrypted communications.

Let's see if this applies to guns \s

@mmasnick

@taylorlorenz @mmasnick
This victory will turn to ashes when you taste it...
This is going to be used as justification for so-called "age verification" surveillance and software backdoors, which will then be used by ICE.