Nature issues its rules for use of LLMs in science papers:
1. ChatGPT can't be an author because it can't take responsibility for the work.
2. Any use of LLMs must be transparently disclosed.
Sensible.

https://www.nature.com/articles/d41586-023-00191-1

Tools such as ChatGPT threaten transparent science; here are our ground rules for their use

As researchers dive into the brave new world of advanced AI chatbots, publishers need to acknowledge their legitimate uses and lay down clear guidelines to avoid abuse.

@jeffjarvis There's a transcript/screenshot of a ChatGPT session where the user gets it to agree that 2+2=5, and others where its "facts" turn out, on checking, to be false.

My wife likes to point out that every line in a syllabus, a policy, or a notice has a story behind it.

With that in mind, this Nature rules change is chilling.

@GreatBigTable @jeffjarvis Do you mean that the circumstances requiring the rules change are chilling? I'd say the rules change itself is pretty much spot on!