Nature issues its rules for use of LLMs in science papers:
1. ChatGPT can't be an author because it cannot be responsible.
2. Transparency.
Sensible.
@jeffjarvis There is a transcript/screenshot of a ChatGPT session where the user gets it to agree that 2+2=5, and others where its "facts," when checked, turn out to be false.
My wife likes to point out that every line of a syllabus, in a policy, or on a notice has a story behind it.
With that in mind, this Nature rules change is chilling.
@jeffjarvis
I think they need a zeroth rule: validate the responses ChatGPT gives you before anything else.
IMNSHO that is the biggest issue with the use of ChatGPT right now: people are not verifying the responses to the questions they ask. They are taking the results as fact.
Then they need a third rule: go back and double-check that the question you asked is really what you meant to ask. ChatGPT answers the question you asked, not the question you thought you asked.