When talking about the energy consumed by LLMs, don't be fooled by arguments that focus solely on the direct power consumption of the models themselves, because a lot of it is being externalized. Browsers now have to do proof-of-work calculations to access many websites, because those websites need to protect themselves from AI scrapers. That takes power! Let that sink in: every computer, tablet, or phone on earth now consumes more power *every time it accesses a webpage* because of "AI".
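The proof-of-work schemes mentioned here (tools like Anubis work roughly this way) make the browser burn CPU before the page loads: find a nonce whose hash of challenge + nonce falls below a difficulty target, which the server can verify with a single hash. A minimal sketch — the function names and difficulty are illustrative, not any particular tool's API:

```python
import hashlib
import itertools

def solve(challenge: str, difficulty_bits: int) -> int:
    """Client side: brute-force a nonce so that
    SHA-256(challenge + nonce) starts with `difficulty_bits` zero bits.
    This is the CPU (and power) the visitor's device burns per challenge."""
    target = 1 << (256 - difficulty_bits)
    for nonce in itertools.count():
        digest = hashlib.sha256(f"{challenge}{nonce}".encode()).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce

def verify(challenge: str, nonce: int, difficulty_bits: int) -> bool:
    """Server side: a single hash to check what took the client
    ~2**difficulty_bits hash attempts on average to find."""
    digest = hashlib.sha256(f"{challenge}{nonce}".encode()).digest()
    return int.from_bytes(digest, "big") < (1 << (256 - difficulty_bits))

nonce = solve("example-challenge", 16)  # roughly 65k hashes on average
assert verify("example-challenge", nonce, 16)
```

The asymmetry is the point: verification is one hash, solving is tens of thousands, and at web scale that multiplied-out client work is exactly the externalized energy cost the post is describing.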
@gabrielesvelto hey, some sites aren't using proof of work protection, they're getting hammered instead.
@dysfun @gabrielesvelto Which results in the same, if not a larger, energy bill.
@gabrielesvelto scrapers are not merely a consequence of AI.
@Qbitzerre @gabrielesvelto
The problem is not "X exists", but "X has reached absurd amounts of traffic that everyone now has to mitigate against". That is from AI.

@tarmil @gabrielesvelto Perhaps. Does the existence of AI incentivize more entities to deploy spidering than in the past? I suppose the premise is that grabbing content becomes a more compelling prospect when you can process it for greater ROI than simple indexing and/or mirroring offers.

It makes some sense but is hardly assured.

@Qbitzerre @gabrielesvelto I'm not sure why you're speculating here, it's an observable fact. Countless sites are getting hammered by bots from AI companies, at a scale that had never happened before.
@tarmil @gabrielesvelto I speculated about the future.

@Qbitzerre @tarmil @gabrielesvelto https://www.theregister.com/2025/08/29/ai_web_crawlers_are_destroying/

Not just perhaps. Not just many more entities. Far, far more aggressive crawling. Crawling that doesn't take everything and leave, it takes everything and then starts again. Crawling that doesn't respect robots.txt in the slightest.

@miclgael @tarmil @gabrielesvelto good coverage there. Thanks for the link.

Seems like a lot of site admins can show spectacular increases in the frequency and volume of scraping, as well as in disregard for previously established norms (e.g. robots.txt), now that scrapers are looking for LLM training material.

@Qbitzerre @gabrielesvelto

@Landa @gabrielesvelto I don't doubt it. I've seen these reports too. I wonder if it will persist at the same levels for long, or if it is a gold rush.
@Qbitzerre I hope it stops one way or the other.
@gabrielesvelto
@gabrielesvelto @Qbitzerre Scrapers have been around for decades. They became a problem with AI. Splitting hairs won't help. We did not need to do this shit five years ago. We need it now. You can draw a clear line from AI scraping to these proof-of-work checks. It *is* something AI made worse.
@claudius @gabrielesvelto not splitting hairs. Nor am I denying that AI exacerbates the problem - as it similarly increases demand for cycles/energy in many dimensions. Merely considering the issue in broader terms. Why must this be a debate?
@Qbitzerre @gabrielesvelto But the AI scrapers broke the social contract.
We allowed search engines to index our websites and they displayed links to our sites on relevant searches. There was also a “gentleman’s agreement” that they won’t index parts of our sites if we ask them not to.
The AI bots now scrape the same thing multiple times, they ignore robots.txt and we get nothing in return.
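The "gentleman's agreement" above is literally just a plain text file that crawlers are trusted to honor; nothing enforces it. A typical robots.txt aimed at AI crawlers looks something like this (GPTBot and CCBot are published crawler user-agent tokens; compliance is entirely voluntary, which is the whole problem):

```text
# Ask AI training crawlers to stay out; honoring this is voluntary.
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

# Ordinary crawlers may index everything except the private area.
User-agent: *
Disallow: /private/
```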

@grumpyoldtechie @gabrielesvelto I had the impression that compliance with a robots.txt file is hit or miss regardless of the purpose of the scraping. Beyond that, how rapacious scraping is remains at least somewhat subjective, existing on a spectrum that depends in part on the application and purpose.

The qualitative difference with AI is that its derivative products raise novel questions about what might have been deemed fair use in the past.

@gabrielesvelto Proof of work! Does that mean it's taking 3 times the power of credit/debit card transactions... occurring all the time, more often than ever before?!
@gabrielesvelto The next round of what spam has been doing for 20 years.
@gabrielesvelto Good point. The pursuit of surveillance at ALL costs (and our expense!)

@gabrielesvelto
Another stupid power-related argument:
"Humans also consume power" or "Humans consume more power per task" or anything of that stripe.

Humans consume energy at a near-constant rate.
They keep consuming energy when you take away their tasks to feed to the autocomplete machine.
The only way to "save" that energy is to kill those humans, but killing humans is extremely energy intensive (not to mention fucking evil).
So how exactly do these TESCREAL ratfucks expect this "but humans also..." argument to work?

The humans still exist, hence the energy spent on the autocomplete machine == wasted.

@gabrielesvelto Even earlier than that, there are the energy and resources consumed to build the chips and servers those LLMs run on.
@gabrielesvelto don't forget the bandwidth. That all uses energy as well
@gabrielesvelto @karl and most numbers floating around are about inference, not training (which is vastly more power-hungry). And they never factor in the carbon/energy footprint of manufacturing the hardware.

@gabrielesvelto YES. This is having a daily impact on the usability of the ENTIRE world wide web, as we repeatedly have to prove we are human, or see sites go dark under the constant DDoS attacks that web-scraping LLM bots bring.

#AI #NoAI #GenAI #LLM

@gabrielesvelto My grandkids won't die if they have to go without artificially intelligent google surveillance.
They *could* be killed in the fire, floods and famine coming our way from climate change.
Meanwhile, AI, with its greed for energy and water will double the speed of approaching climate disaster.
But (boo hoo) billionaires won't be able to even more deeply analyze, direct, and monetize our thoughts and attention
BAID!
Billionaire Assistance In Dying.