See, this — •this• — is what ICE+CBP are actually for under this regime. •This• is why they got a budget the size of Russia’s entire military.

It’s nothing to do with immigration. It’s about the would-be dictator having his own private military that:

- can be deployed domestically
- at will
- unbeholden to the UCMJ etc
- or disciplined military culture
- or any accountability whatsoever
- with unpredictably violent behavior
- applied indiscriminately
- to anyone
- so that the entire population is terrified of it

…because for them, the problem with Jan 6 was that the coup wasn’t violent enough and didn’t have enough weapons.

https://www.theguardian.com/us-news/2026/feb/04/steve-bannon-ice-immigration-agents-polling-sites-midterm-elections

Steve Bannon calls for immigration agents at polling sites during midterms

Ex-Trump adviser adds to elections officials’ concern about potential interference from Trump administration in voting

The Guardian

@inthehands What will it take for Americans to overthrow this fascist regime?

There is no question what is happening. It’s a coup. In broad daylight. 🤷‍♂️

@gimulnautti @inthehands it would be a coup if nobody wanted it. Don't forget that a sizable portion of the population is completely ok with this 😕

@misjavanlaatum
> don't forget that a sizable portion of the population is completely ok with this

I'm not convinced of that. Most USAmericans have been marinating in a stew of nonsense on DataFarming platforms for more than a decade, breaking down our ability to do collective sense-making;

https://www.programmablemutter.com/p/were-getting-the-social-media-crisis

I suspect what people are ok with is a world that has been pulled over their eyes to blind them from the truth, to paraphrase Morpheus in The Matrix.

@gimulnautti @inthehands

We're getting the social media crisis wrong

The bigger problem isn't disinformation. It's degraded democratic publics

Programmable Mutter

@strypey @gimulnautti @inthehands yeah, you make a good point. Let's definitely not underestimate the role big tech plays in this...

@misjavanlaatum
> Let's definitely not underestimate the role big tech plays in this

Indeed, and as Farrell says in the piece I linked, it's not just a problem of mis/disinformation - unfacts presented as facts. Those platforms feed our subconscious worldbuilding heuristics a persistent false sense of the world, against which factual claims are evaluated, and everyone is being fed a subtly different false sense, eroding our capacity for social consensus-finding.

@gimulnautti @inthehands

(1/?)

Me:
> eroding our capacity for social consensus-finding

This got me thinking about why the word 'moderator' describes the curator of a communication channel.

There's a habit of mentally substituting 'censor' when we think about that role, even among those who approve of it. But the work of moderation isn't primarily about deciding whether this or that post/user account is welcome in the channel. That's a means to an end.

#SocialMedia #moderation

@misjavanlaatum @gimulnautti @inthehands

(2/?)

As the name implies, a moderator's core role is to prevent the social environment of the channel they moderate from being warped by people on the extremes of any given issue.

The goal isn't primarily to eliminate discussion of extreme positions, or the people who hold them (although in some cases that's necessary), but to ensure that when extreme positions are discussed, they're understood in that context: as positions considered extreme (at least for now) in that community.

(3/?)

What a good moderator is seeking to do is prevent people pushing extreme views from exploiting popularity bias: posting so often, whenever the topic they're extreme on comes up, that they create the false impression their position is accepted - or even dominant - among people in the channel, getting more people to accept their view without careful scrutiny because it seems widely held.

(4/4)

This is a big part of how DataFarming platforms nudge people. Not just by presenting nuggets of mis/disinformation, but by flooding the channel with them - either because they're paid to, or because The Algorithms find it increases engagement.

It's not necessary to get everyone to believe specific nonsense. If you can just give them the impression that many people do (eg that many people support Big Tech, or what Orange Stalin's administration is doing, or whatever), that can still shape their actions.