Watching the House Oversight hearing and the ex-Twitter witnesses are doing a great job while being attacked by pretty much everybody.

Some initial reactions:
1) Despite the high-profile nature of this hearing, members don't seem much better prepared than in other tech hearings. Lots of misuse of technical terms (Rep. Jordan asking about "hard coding" multiple times) and confusion about the tweets of non-consensual Hunter Biden images.

https://www.youtube.com/watch?v=-Fo_yD8r3w4

Full Committee Hearing - Part 1: Twitter’s Role in Suppressing the Biden Laptop Story


2) The panel, especially Yoel, is doing an excellent job of staying calm and explaining why the wildest theories espoused by the committee are not supported by the evidence.

3) This is a clear demonstration of the no-way-to-win dynamic on all political content moderation. I expect it will have the (intentional?) effect of reducing the willingness of companies to take any action on political accounts. This was a dynamic @evelyndouek, Nate Persily and I discussed here:

https://moderated-content.simplecast.com/episodes/meta-reinstates-trumps-accounts-7l3bLsEf

Meta Reinstates Trump's Accounts | Moderated Content

Evelyn sits down with Nate Persily, Professor at Stanford Law School, and Alex Stamos, director of the Stanford Internet Observatory, to discuss Meta's decision that it is reinstating former President Trump's accounts. Nate is pragmatic, Alex is cynical, and Evelyn is a naive little formalist about it all. Here's their quick takes.

4) Rep. Taylor Greene is fortunate that the Speech and Debate Clause protects her from a slander lawsuit from Dr. Roth, as she once again repeats a lie (one amplified by Mr. Musk) about his PhD dissertation that will drive even more abuse and threats his way.
@alex no one who hates Americans should serve
@alex @evelyndouek honestly you would be better off rebasing any kind of social media company in Europe at this point and avoiding this bullshit.

@mdh @evelyndouek Eh, no. This is about to get much worse in the EU as there will be years of litigation around the speech rules of each member state and the DSA.

Unfortunately, no major democracy is going to be able to resist trying to shape the speech of their citizens online.

@alex @evelyndouek I guess I was thinking in terms of the partisanship and assuming free speech maximisation wasn’t your end goal. Europe seemed calmer and more predictable in that sense but probably doesn’t do well with US free speech discourse to be fair.
@mdh @evelyndouek I think it's much less predictable because companies operating in the EU face the same dynamics, empowered by the DSA, in 27 governments ranging from socialist parties to Orbán's Hungary.
@alex @evelyndouek going to defer to your expertise and experience on this one. Fair enough.

@alex @evelyndouek

Framing it as "no-way-to-win" is a false dichotomy. Neither the panel nor the podcast discussed a third option: informing users when their content has been moderated, whether removed or reduced. In fact, you still champion new "reach reduction" techniques (16:12). That's the wrong approach.

Facebook still has a "Hide comment" button. On Reddit, over 50% of accounts have had comments removed without their knowledge, because the system hides the removal from them.
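The shadow-moderation claim above rests on a simple comparison: the author's logged-in view of a comment differs from what logged-out visitors see. A minimal sketch of that check (the function name and sample strings are hypothetical; a real tool like Reveddit fetches both views from Reddit's public API rather than taking them as arguments):

```python
from typing import Optional

def is_shadow_removed(author_view_body: str, public_view_body: Optional[str]) -> bool:
    """True when the author still sees their comment but the public does not.

    author_view_body: comment text as the logged-in author sees it.
    public_view_body: comment text as a logged-out visitor sees it, or None
    if the comment is missing from the public thread entirely.
    """
    author_sees_it = author_view_body not in ("[removed]", "[deleted]")
    public_sees_it = public_view_body == author_view_body
    return author_sees_it and not public_sees_it

# Author sees the original text; the public sees "[removed]": shadow-removed.
print(is_shadow_removed("my original comment", "[removed]"))            # True
# Both views match, so the comment is publicly visible.
print(is_shadow_removed("my original comment", "my original comment"))  # False
```

The point of the comparison is that neither view alone reveals the removal: only by diffing the two can the author learn what happened.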

Platforms could take the hard step of being transparent, and we can hold them to that. Asking them to remove more content before demanding transparency makes you complicit in the censorship regime, or a lot like those who want to build lists of toxic users without concern for how that data is used:

https://old.reddit.com/r/TheoryOfReddit/comments/ymeqaz/hate_on_reddit_a_global_lists_of_toxic_users/iv4aet6/

Hate on Reddit: A Global Lists of "Toxic" Users

@rhaksw @evelyndouek Yoel's testimony specifically talked about moderation transparency, and you are apparently not familiar with my work, as I've been championing such things for years. https://youtu.be/ATmQj787Jcc?t=1823
The Platform Challenge: Balancing Safety, Privacy and Freedom — Alex Stamos (DataEDGE 2019)


@alex @evelyndouek

Alex, I built Reveddit. My argument pertains to the ordering of your priorities.

Demands for transparency must be louder than demands to remove disinformation. In other words, it's not worth helping platforms flag content when the moderation, be it removal or reduction, is done without the knowledge of that content's author.

The reason we're all stuck on the same platforms is that people don't know they're being moderated, both on the left and the right, on every major platform. That's no "wild theory".

About Yoel: he only advocated transparency of rules, not of individual outcomes as you related in the linked video,

https://oversight.house.gov/wp-content/uploads/2023/02/Roth-House-Oversight-opening-statement-V4-Final.pdf

"what we tried to do at Twitter — across every decision — was to create a rules-based system of governance that would make clear what’s allowed, or not, on Twitter, and why. Transparency is at the heart of this work, and it’s where I think Twitter — and all of social media — can and must do better"

@alex The FBI had possession of the Hunter Biden laptop, and was feeding misinformation to social media.

“It is also axiomatic that a state may not induce, encourage or promote private persons to accomplish what it is constitutionally forbidden to accomplish.” ~ Norwood v. Harrison (1973).

@PirateRoberts A hearing on the interaction between the FBI and other USG agencies and Twitter is totally appropriate, although no evidence specifically tying Twitter's decision on the Post story to FBI information has surfaced.

However, if it is inappropriate for government actors to put pressure on private actors to shape 1A protected speech decisions, then this hearing is also inappropriate. Using the subpoena power to punish these individuals for using their 1A rights is also jawboning.

@alex Who is being punished? If it is individuals being paid by taxpayers, who have positions of trust, then yes, it is appropriate.
@PirateRoberts The Twitter witnesses are private individuals who made decisions within Twitter's First Amendment-protected ability to decide what speech it carries and amplifies. A member of the House of Representatives strongly implied that one of them, who was compelled to appear by the power vested in the committee, is a pedophile. That is clearly an example of using the power and funding of the US Government to punish a private individual.