Turns out, ChatGPT just made them up. The papers don't exist. That's a handy thing to know.
@JubalBarca @tolortslubor @riskindan
My take on it is: GPT-3 doesn't have *structure*.
Wanna have structure? You've got to supply it yourself in your prompt.
It's actually good at fluffing up raw, cold, structured data... and making it nicely human-readable.
@JubalBarca @tolortslubor @riskindan
An intuitive explanation I've heard is:
ChatGPT just wants to tell you a cool story.
If it can be said in the language, the model will generate it, with no value attached as to truth or fact; it just has to be sayable and fit the required style.
Ask it: “Explain the research of [made up name] into [made up research area]”. Then ask it to provide citations.
I tried this too.
@tolortslubor @riskindan In order to lie, you have to understand what you are saying.
So-called AI does not. There's no "there" there.