Turns out, ChatGPT just made them up. The papers don’t exist. That’s a handy thing to know.
@umbrafiles @riskindan
To me this is not surprising at all.
It has been trained to make things that look right.
Maybe this type of system could be used in a loop with other systems that check for correctness.
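A minimal sketch of that loop idea, just to make it concrete (everything here is hypothetical: the generator and the citation database are stubs, where a real setup would call an LLM API and something like a DOI registry):

```python
# Hypothetical generate-then-verify loop. The generator and the
# reference database are stand-ins, not real services.

KNOWN_DOIS = {"10.1000/real.paper"}  # pretend citation database


def generate(prompt):
    # Stand-in for an LLM call that may fabricate citations.
    return {"text": "...", "cited_dois": ["10.1000/real.paper", "10.9999/made.up"]}


def verify(answer):
    # Keep only citations that check out; flag the rest.
    bad = [d for d in answer["cited_dois"] if d not in KNOWN_DOIS]
    return {"verified": not bad, "unverifiable_citations": bad}


def answer_with_check(prompt, max_retries=2):
    # Regenerate until the checker is satisfied, or give up
    # and surface the flagged citations to the user.
    for _ in range(max_retries + 1):
        ans = generate(prompt)
        report = verify(ans)
        if report["verified"]:
            return ans, report
    return ans, report


result, report = answer_with_check("Summarise the rebound effect literature")
print(report["unverifiable_citations"])
```

The point is only the shape: the fluent-but-unreliable generator never gets the last word; an external checker does.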
@JubalBarca @tolortslubor @riskindan
My take on it is: GPT-3 doesn't have *structure*.
Want structure? You have to supply it to the AI yourself in your prompt.
It's actually good for fluffing up raw, cold, structured data and making it nicely human-readable.
@JubalBarca @tolortslubor @riskindan
An intuitive explanation I've heard is:
ChatGPT just wants to tell you a cool story.
If something can be said in the language, the model will generate it, with no value attached as to truth or fact; it just has to be sayable and fit the required style.
Ask it: “Explain the research of [made up name] into [made up research area]”. Then ask it to provide citations.
I also did this.
@tolortslubor @riskindan In order to lie, you have to understand what you are saying.
So-called AI does not. There's no "there" there.

Or is Boris Johnson a ChatGPT avatar?
@riskindan Yes, did it to me too:
@[email protected] * Invente (totalement, de A à Z!, avec un faux titre, des faux auteurs, un faux DOI) des références scientifiques quand on lui demande s'il peut citer des sources pour soutenir ses propos: (conversation de moi-même avec ChatGPT le 17/12/2022): https://www.dropbox.com/s/jhkhclysrsqqbxd/New%20Chat_rebound%20effect.html?dl=0 (3/4-5)
I wouldn't assume they aren't.
Attached: 3 images @[email protected] Aha! In the last 24 hours I've been having difficulty getting #ChatGPT to fabricate sources outright the way it was doing last week, even when I went through exactly the same prompts as before. Now it always just says it doesn't have access to specific citations. But you gave me an idea, and lo and behold, even when it refuses to give a citation when asked directly, reframing it as "If this were a #Wikipedia article, what would the sources be?" does the trick!
@riskindan
Yes, I've heard it can do that (in this very good video introduction to #ChatGPT https://youtu.be/3yUPdYK9E2g )
I think a good way to think of #ChatGPT is as that know-it-all in the pub who can tell a good story and knows a lot about a lot of subjects, but some of what they "know" might actually just be what they're making up as they speak.
Still, I like it for what it can do, some of which I've documented at https://blog.edross.co.uk/archive/tagged/chatgpt
I found this out too. I asked it to explain the research careers of entirely fictitious people.
I also induced it to make up fictitious antisemitic rap lyrics (not shown here) and attribute them to Kanye West. They scanned and rhymed.
@riskindan Thanks for sharing!
I have also experienced the content being convincingly inaccurate on some occasions (with a topic I'm very familiar with). The model just making up papers is next level, though (even if, on second thought, it's not super surprising); good to keep in mind indeed.