Ahhhh, that is what the response to LLMs reminds me of: COVID! It's the same shoulder-shrugging "COVID is endemic, we should get used to it" response.
For me, AI means artificial intelligence. So, not an LLM, for example.
In this case the common usage rests on a very big misunderstanding.
People assume that the LLM thinks: that you explain the problem and it works out a solution.
What an LLM actually does is find text where the same words as in the question (or their synonyms) appear, and then output an answer that is statistically typical for that question across the Internet.
It is based wholly on statistical analysis of words and involves no analysis of their meaning.
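The "statistics over words" idea can be sketched with a deliberately crude toy: a bigram model that predicts the next word purely from co-occurrence counts in a made-up corpus. (Real LLMs are neural networks trained over far larger contexts, not lookup tables like this, but the sketch makes the point that such a model tracks word frequencies, not meanings.)

```python
from collections import defaultdict, Counter

# Tiny made-up corpus, purely for illustration.
corpus = (
    "the cat sat on the mat . "
    "the dog sat on the rug . "
    "the cat chased the dog ."
).split()

# Count how often each word follows each other word.
counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    counts[prev][nxt] += 1

def most_likely_next(word):
    """Return the statistically most common follower of `word` in the corpus."""
    return counts[word].most_common(1)[0][0]

print(most_likely_next("sat"))  # "on" -- both occurrences of "sat" are followed by "on"
print(most_likely_next("on"))   # "the"
```

The model "answers" correctly about what follows "sat" without having any notion of sitting; it only knows which words tend to appear next to which.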