You… you know... that LLMs... don't continue training when you use them... right? Like, the P in GPT stands for Pre-trained?... When you chat with an LLM, it doesn't retrain anything, because the model is a frozen snapshot of its weights and....
oh, never mind. Why bother.
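Fine, one more try. Here's a toy sketch of the point (hypothetical names, not any real framework's API): inference only *reads* the weights, it never *writes* them.

```python
import copy

# A "model" is, for this toy's purposes, just a frozen snapshot of
# numbers (its weights). Hypothetical example, not a real framework.
weights = {"w1": 0.5, "w2": -1.2}

def chat(weights, prompt):
    # Inference only READS the weights to produce an output.
    # Nothing in here assigns back into `weights`.
    score = sum(weights.values()) + 0.0 * len(prompt)
    return f"reply(score={score})"

before = copy.deepcopy(weights)
reply = chat(weights, "hello, do you remember me?")
after = weights

# ...so the snapshot is identical before and after the "conversation".
assert before == after
```

Chat "memory" comes from stuffing the conversation back into the prompt, not from updating the snapshot.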
