Lol Neil Gaiman says
"ChatGPT doesn't give you information. It gives you information-shaped sentences."
This is one of the better ones I have seen.
"ChatGPT is bullshit": https://link.springer.com/content/pdf/10.1007/s10676-024-09775-5.pdf The title is clearly baity, but the content is excellent. It focuses on the fact that LLMs "simply aim to replicate human speech or writing," not to provide information or facts. The authors then lay out different types of "bullshit" and ask whether what LLMs produce falls into any of those types. TL;DR: yes, it does. This feels like a good model for how to think about LLMs. Whether they are useful, and how to use them if at all, is a separate question.