Local AI! Mini-LLM!
Currently, a large portion of the work can be done on an ancient laptop running Linux Mint with 16 GB of RAM, a 4B model, and LM Studio.
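For anyone curious how a setup like this gets used in practice: LM Studio can serve a loaded model through a local OpenAI-compatible API (by default at http://localhost:1234/v1). The sketch below is a minimal, stdlib-only example of talking to it; the model name and prompt are placeholders, not part of the setup described above.

```python
import json
import urllib.request

def build_request(prompt, model="local-4b-model"):
    """Build a chat-completion payload for LM Studio's local server.

    The model name here is a placeholder; LM Studio uses whatever
    model you have loaded in the app.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }

def ask_local_llm(prompt, url="http://localhost:1234/v1/chat/completions"):
    """Send the prompt to the local server and return the reply text."""
    payload = json.dumps(build_request(prompt)).encode("utf-8")
    req = urllib.request.Request(
        url, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]

# Usage (with LM Studio's server running):
#   reply = ask_local_llm("Why run an LLM locally?")
```

No API keys, no data center, no cloud bill: the whole round trip stays on the laptop.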
Who needs gigantic data-centers? Not I! ;0)
It's not the size of your tech that matters ... it's what you do with what you've got.