This is a great summary by @SashaMTL of the environmental and human costs of so-called "AI" technology.

https://arstechnica.com/gadgets/2023/04/generative-ai-is-cool-but-lets-not-forget-its-human-and-environmental-costs/


The mounting human and environmental costs of generative AI

Op-ed: Planetary impacts, escalating financial costs, and labor exploitation all factor.

Ars Technica

@SashaMTL

"For instance, with ChatGPT, which was queried by tens of millions of users at its peak a month ago, thousands of copies of the model are running in parallel, responding to user queries in real time, all while using megawatt hours of electricity and generating metric tons of carbon emissions. It’s hard to estimate the exact quantity of emissions this results in, given the secrecy and lack of transparency around these big LLMs."

https://arstechnica.com/gadgets/2023/04/generative-ai-is-cool-but-lets-not-forget-its-human-and-environmental-costs/


@emilymbender @dekkzz76 @SashaMTL All of this illustrates as well as anything what an amazing computer the human brain is. It does qualitatively much more than ChatGPT and its ilk, using only a tiny fraction of the energy.

Some seem to think that LLMs will radically change how everything is done. I wonder whether that will actually scale: is there even enough computing power to give everybody their own LLM?

@mapcar @emilymbender @dekkzz76 @SashaMTL
That depends on whether you can give everyone their own 4GB laptop. That seems to be the minimum hardware requirement for the smallest LLaMA model. Someone did manage to run Alpaca on a Raspberry Pi, but at one token per minute you might be waiting quite some time for an answer.
https://medium.com/@martin-thissen/llama-alpaca-chatgpt-on-your-local-computer-tutorial-17adda704c23
LLaMA & Alpaca: “ChatGPT” On Your Local Computer 🤯 | Tutorial

In this article I will show you how you can run state-of-the-art large language models on your local computer. Yes, you’ve heard right. For this we will use the dalai library which allows us to run…

Medium
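A back-of-the-envelope sketch of why one token per minute is painful. The ~100-token answer length and the laptop rate below are illustrative assumptions, not benchmarks; only the one-token-per-minute Raspberry Pi figure comes from the post above.

```python
# Back-of-the-envelope: how long a reply takes at a steady generation speed.
# Assumed numbers: a ~100-token answer; the laptop rate is a rough guess.

def wait_minutes(answer_tokens: int, tokens_per_minute: float) -> float:
    """Minutes to generate an answer at a steady token rate."""
    return answer_tokens / tokens_per_minute

# Raspberry Pi running Alpaca at ~1 token/minute (as reported above):
print(wait_minutes(100, 1))    # 100.0 -> over an hour and a half per reply

# A small laptop managing, say, ~3 tokens/second (180/minute, assumed):
print(wait_minutes(100, 180))  # well under a minute
```

The point being that "it runs" and "it is usable" are separated by roughly two orders of magnitude in token throughput.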

@bornach @emilymbender @dekkzz76 @SashaMTL It could certainly be that coming improvements in hardware and software will reduce the energy consumption of AI, but that probably will not help overall, as I incidentally just learned from Molly Wood's newsletter.

If you worry about climate, sustainability and the future of the planet, this is a highly recommended read (and very relevant to the original post): https://www.mollywood.co/p/how-ai-kills-us-energy-use

How AI kills us: energy use

Hey, listen, I get that we're all excited about AI and the rush to humanity-destroying artificial general intelligence. But learn your lessons from The Matrix, children. It's the energy that kills ya.

Molly Wood Media