Nature issues its rules for use of LLMs in science papers:
1. ChatGPT can't be credited as an author, because an author must be accountable for the work and an LLM can't be.
2. Any use of LLM tools must be transparently disclosed.
Sensible.

https://www.nature.com/articles/d41586-023-00191-1

Tools such as ChatGPT threaten transparent science; here are our ground rules for their use

As researchers dive into the brave new world of advanced AI chatbots, publishers need to acknowledge their legitimate uses and lay down clear guidelines to avoid abuse.

@jeffjarvis
I think they need a zeroth rule: validate the responses ChatGPT gives you before anything else.

IMNSHO that is the biggest issue with the use of ChatGPT right now: people are not verifying the responses to the questions they ask; they are taking the results as fact.

Then they need a third rule: go back and double-check that the question you asked is really what you meant to ask. ChatGPT answers the question you asked, not the question you thought you asked.