I find it unlikely that someone would be on Lemmy and not know the basics of how ““AI”” actually works. But if you truly don’t know, the rundown is that all of these AI apps you use are just an interface for sending a request to an ““AI”” running somewhere else. The ““AI”” is not running on your computer. It’s like messaging someone “hey, do this for me”, letting them do it, and then claiming you don’t feel tired afterwards.
You can use a 15-year-old PC or a top-of-the-line gaming rig, go to the ChatGPT website, and request something, and the result will be the same, because it’s not your machine doing the work.
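To make the “it’s just an interface” point concrete, here’s a minimal sketch of what an AI app actually does on your machine. The endpoint URL, API key, and payload fields below are made up for illustration; the real point is that your computer only packages a bit of text and waits:

```python
import json

# All the local "work": serialize the prompt into a request.
# The model itself runs on the provider's servers, not here.
def build_request(prompt: str) -> dict:
    return {
        "url": "https://api.example.com/v1/generate",   # hypothetical endpoint
        "headers": {"Authorization": "Bearer YOUR_KEY"},  # placeholder key
        "body": json.dumps({"prompt": prompt, "max_tokens": 256}),
    }

req = build_request("hey, do this for me")
print(req["url"])
```

Serializing a few hundred bytes of JSON is trivial for any hardware from the last few decades, which is exactly why an ancient PC and a gaming rig get identical results from the website.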
Now, you can indeed run a local AI model on your machine, and if you try, you’ll quickly see that you need beefy hardware and that your power use will spike like crazy to deliver results way slower than what you’d get from the app/website. Which makes it obvious that they’re running stronger (and more) hardware than you are, and therefore drawing way more energy than you are.
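To put rough numbers on that hardware gap (all figures here are illustrative assumptions, not measurements): suppose your local model runs on one consumer GPU drawing around 350 W, while the service answers you from a rack of eight datacenter accelerators drawing around 700 W each:

```python
# Illustrative assumptions, not measured values:
your_gpu_w = 350         # one consumer GPU running a local model
server_rack_w = 8 * 700  # a hypothetical rack of datacenter accelerators

ratio = server_rack_w / your_gpu_w
print(ratio)  # the rack draws 16x the power of your single GPU
```

Under those assumptions the serving hardware pulls 16 times the power your machine does, which is how it returns answers faster than your local setup ever could.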
“Blueberries are fucking purple!!” - Randy Feltface