The first dose is free...
Local AI sounds nice, but you need expensive* hardware, and the quality lags 6 to 12 months behind commercial AI – to put it simply.
I’m looking forward to seeing them; your comics are simply brilliant.
* Prices for local AI hardware are actually falling, at least until the next hardware price surge hits.
@sam4000 @davidrevoy Hardware prices have been rising alongside RAM, at least on the used market, which is the only place I regularly check. My GPU, which cost me 600€ a little over a year ago, goes for 750-800€ now. I don't know what falling prices you mean.
LLMs of any size are all shit, except for a few particular tasks, most of which don't require a GPU this expensive (or any GPU at all, if you have some patience). I keep a very small LM loaded at all times for image transcriptions and things like that. Most of my VRAM goes to Blender or VR stuff. I _could_ load much bigger models (when I'm not using the GPU for graphics), but I don't really need to.
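For anyone curious what "a very small LM for image transcriptions" can look like in practice, here is a minimal sketch using Python with Hugging Face's transformers library and the small BLIP captioning model. This is an assumed setup for illustration, not necessarily the exact model or tooling described above, and the image path is a placeholder:

```python
# A small local captioning model of the kind mentioned above.
# Assumption: Hugging Face transformers + the BLIP base model
# (~1 GB weights); runs on CPU, just slower than on a GPU.
from PIL import Image
from transformers import BlipProcessor, BlipForConditionalGeneration

model_id = "Salesforce/blip-image-captioning-base"
processor = BlipProcessor.from_pretrained(model_id)
model = BlipForConditionalGeneration.from_pretrained(model_id)

# Placeholder image path for illustration.
image = Image.open("screenshot.png").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

# Generate a short caption; no GPU required, only some patience.
output_ids = model.generate(**inputs, max_new_tokens=30)
print(processor.decode(output_ids[0], skip_special_tokens=True))
```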
Yes, that’s what I meant in the second part.
In the past, you needed an expensive server GPU; today, you can use expensive consumer hardware. That’s why I mentioned the first part.