This proposes a way of using AI agents to produce research. Ok. But this bit is a pipe dream: "And human scientists should retain authority over — and responsibility for — framing the question, validating the path and signing off on conclusions." Here's why... /1

RE: https://bsky.app/profile/did:plc:jviud2kbpxo3lwd3do4mqepg/post/3mghvizp6x22k
As soon as you start down this road, the volume of output - not just code and logic, which they describe, but results and conclusions - immediately surpasses the human capacity to read and assess it. And the people running such a process are still driven by our current institutional incentives. /2
They fall in love with the process, trust it too much, and start rubber-stamping the results. Some scientists already "co-author" literally more papers than they have time to read. What is this agent-driven process going to do to our ACTUALLY EXISTING SCHOLARLY COMMUNICATION SYSTEM? Destroy it. /3
The units of output are still "papers," and these processes immediately produce more than anyone EXCEPT MACHINES can read and evaluate. The authors of this paper - if they haven't already - will be proposing agentic "peer review" and publication tomorrow when they realize that's the only option. /4
AI agents can "do" science like they propose, but the idea that it will be supervised and assessed by humans is a dangerous myth. It can't happen. This is why so many of the AI "declarations" I already see on papers are bullshit. /5
The authors skim the results, claim they are "responsible" for them - and then press a button to submit the paper, immediately creating an untenable burden for the real humans trying to run a preprint server or journal - and turn to "writing" the next one. /6
I see many "researchers" who produce multiple "papers" per week - and just dump them in the commons. The volume is too great to review and assess. Something has to give. Where we are going at present is that human review, by reviewers and even "authors" themselves, is what's collapsing. /7
We needed to redesign scholarly communication (procedures, units of output, incentives, publication models) before this happened - and it's already too late. These people are crushing the system and blaming the victims for not being prepared to handle the weight of their output. /8
AI peer review and publication of "papers" is already happening on a rapidly growing scale. When the publication process is turned over to the prompt engineers, reading itself inevitably follows, and then the social process of responding to and acting on scientific results will, too. /9
tldr: Without scholarly communication -- which is, after all, SCHOLARS COMMUNICATING -- science as a social system dies in darkness. /10