Nature issues its rules for use of LLMs in science papers:
1. ChatGPT can't be an author because it cannot be responsible.
2. Transparency.
Sensible.
@jeffjarvis There is a transcript/screenshot of a ChatGPT session where the user gets it to agree that 2+2=5, and others where its "facts" turn out, on checking, to be false.
My wife likes to point out that every line in a syllabus, a policy, or a notice has a story behind it.
With that in mind, this Nature rules change is chilling.