For the last two days, Elon Musk has been publicly freaking out about "EXTREME levels of data scraping," so he added "temporary emergency measures" like blocking logged-out views and imposing tight rate limits on viewing tweets. But, as apparently first noticed here by @sysop408, a JavaScript bug in the Twitter web app is self-DDOSing their servers with an endless loop of requests, which seems related to their scraping panic. https://waxy.org/2023/07/twitter-bug-causes-self-ddos-possibly-causing-elon-musks-emergency-blocks-and-rate-limits-its-amateur-hour/
Twitter bug causes self-DDOS tied to Elon Musk's emergency blocks and rate limits: "It's amateur hour" - Waxy.org

An "amateur hour" JavaScript bug is self-DDOSing Twitter, sending infinite requests from users' browsers, related to — or possibly even causing — Elon Musk's "temporary emergency measures" to stop web scraping.
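The bug class being described is a classic one: a request fails (for example because the client just got rate-limited), and the client retries instantly with no delay and no cap, so every rejection triggers another request. A minimal sketch of the pattern, with invented names (this is not Twitter's actual code), alongside the conventional fix of capped retries with exponential backoff:

```javascript
// Stand-in for a request that always fails (e.g. the client is
// rate-limited or logged out), counting how many times it was called.
function makeFailingFetch() {
  const state = { calls: 0 };
  const fetchFn = async () => {
    state.calls++;
    throw new Error("429 Too Many Requests");
  };
  return { fetchFn, state };
}

// Buggy pattern: on failure, retry immediately. A rate-limited client
// ends up hammering the server as fast as its event loop can spin.
async function naiveRetry(fetchFn, attempts) {
  for (let i = 0; i < attempts; i++) { // capped only so this demo terminates
    try {
      return await fetchFn();
    } catch (e) {
      // no delay, no jitter, no retry cap: instant retry
    }
  }
  return null;
}

// Safer pattern: cap retries and wait exponentially longer between attempts.
async function retryWithBackoff(fetchFn, maxRetries, baseDelayMs) {
  for (let i = 0; i < maxRetries; i++) {
    try {
      return await fetchFn();
    } catch (e) {
      await new Promise((resolve) => setTimeout(resolve, baseDelayMs * 2 ** i));
    }
  }
  return null;
}

// Demo: the naive loop fires 1000 requests back to back; the backoff
// version sends only 5, each spaced further apart.
(async () => {
  const naive = makeFailingFetch();
  await naiveRetry(naive.fetchFn, 1000);
  console.log(`naive retry sent ${naive.state.calls} requests`);

  const polite = makeFailingFetch();
  await retryWithBackoff(polite.fetchFn, 5, 10);
  console.log(`backoff sent ${polite.state.calls} requests`);
})();
```

The nasty property of the naive version is that the server rejecting requests makes the client send *more* of them, which looks exactly like the "endless loop of requests" described above.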

Waxy.org
@andybaio @sysop408 What's data scraping anyway?

@xabitron1 @andybaio @sysop408
It means: 'remote machines that are not authenticated as users are accessing large numbers of resources in a way that suggests they are collecting and archiving resources for their own aims'. You do sometimes get abusive data scraping; for example, there are galaxy-brain machine learners who aggressively scrape resources without a thought for the server.

Competent sysadmins have ways of squashing that kind of thing. Also you can buy services to deal with it.
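One standard way sysadmins squash abusive scraping is per-client rate limiting, e.g. a token bucket keyed by IP: humans browsing stay well under the limit, while a scraper blasting requests quickly starts getting rejected. A minimal illustrative sketch (real deployments usually do this in a reverse proxy or CDN, not application code):

```javascript
// Token-bucket rate limiter: each client gets a bucket that refills at a
// steady rate; each request spends one token, and an empty bucket means 429.
class TokenBucket {
  constructor(capacity, refillPerSecond) {
    this.capacity = capacity;               // max burst size
    this.refillPerSecond = refillPerSecond; // sustained rate allowed
    this.tokens = capacity;
    this.lastRefill = Date.now();
  }

  // Returns true if the request is allowed, false if it should be rejected.
  allow(now = Date.now()) {
    const elapsed = (now - this.lastRefill) / 1000;
    this.tokens = Math.min(
      this.capacity,
      this.tokens + elapsed * this.refillPerSecond
    );
    this.lastRefill = now;
    if (this.tokens >= 1) {
      this.tokens -= 1;
      return true;
    }
    return false;
  }
}

// One bucket per client IP: 10-request bursts, 1 request/second sustained.
const buckets = new Map();
function allowRequest(clientIp, now = Date.now()) {
  if (!buckets.has(clientIp)) buckets.set(clientIp, new TokenBucket(10, 1));
  return buckets.get(clientIp).allow(now);
}
```

With these numbers, a scraper firing 100 requests in the same instant gets 10 through and 90 rejections, while a person clicking around once a second never hits the limit.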

@xabitron1 @andybaio @sysop408
Basically, it's when a program reads stuff from a website. It costs money to serve a request, but humans can't consume pages fast enough for that cost to be an issue. If a program does the reading, it can become one, depending on what it requests and how often.