Can you have child safety and Section 230, too? I think so. But it requires acknowledging the distinction between speech and design: https://www.platformer.news/social-media-trials-230-content-design/
I think calling it a "teen mental health crisis" gives teenagers too little credit and adults too much. Many of the same features that addict and harm teenagers do the same to adults. I don't think we should view this through a "child safety" lens (with all the political connotations that carries) when it affects everyone. We're currently in a period much like the one before "cigarettes are known to be addictive". *Post* "there's a warning on every pack", it's reasonable to expect people to take responsibility for their own usage (as long as it isn't around others). *Pre* "there's a warning on every pack", it can't be written off as "personal responsibility".
@josh @Casey Section 230 says platforms are not responsible for user-generated content because they are not treated as its publisher, i.e. they don't perform an editorial role. But a For You feed algorithm is clearly editorial. I propose a new rule:
Platforms are not responsible for user-generated speech. But platforms are responsible for what speech they show you via their algorithms. A reverse-chronological feed of people you follow would be exempt, as would search results.
I'd want to be a bit broader than "must be reverse-chronological" (e.g. let's not require a legislative update to allow creative things like "show me a curated subset of the people I follow" or "show me posts with a tag I specifically bookmarked"), and I don't especially think Section 230 is the right tool for this, but the general *concept* of "restrict algorithmic optimization for engagement and enragement" seems sound.
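To make the distinction in the proposed rule concrete, here's a minimal sketch (all names and the `predicted_engagement` field are hypothetical, not any platform's actual API): an exempt feed applies only user-specified filters and sorts by recency, while a For You feed substitutes the platform's own ranking model, which is the arguably editorial step.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    timestamp: int               # seconds since epoch
    predicted_engagement: float  # hypothetical platform-assigned score

def chronological_feed(posts, following):
    """Exempt under the proposed rule: filters by an explicit follow
    list and sorts newest-first. The platform makes no ranking choice."""
    return sorted(
        (p for p in posts if p.author in following),
        key=lambda p: p.timestamp,
        reverse=True,
    )

def for_you_feed(posts):
    """Covered by the proposed rule: the platform's model decides
    which speech to surface, regardless of who you follow."""
    return sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)
```

The "broader than reverse-chronological" variants (a curated list, a bookmarked tag) would still be exempt under this framing, since in each case the selection criterion comes from the user, not from the platform's optimizer.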