In the mid-2000s there was outrage in the research community when a computer-generated paper from MIT researchers was accepted at a conference. The point was to show that the proliferation of low-standard conferences was a problem. The paper was so horrible to read that the outrage was entirely justified.
Today I generated a report from a 19-word prompt (problem and scope), executed entirely on local open-weight models running on a cheap mini PC. Within an hour it produced a readable, structured, formatted 16-page PDF. The science is broken overall, but there are helpful chunks throughout, including relevant references. Useful for seeding exploration paths, learning, and fleshing out ideas. Compared to the mid-2000s outrage, this remains wrong, but it has become a useful personal tool.

Settings: this time I tried EurekaClaw, with Qwen 3.5 (35B) as the main model and Qwen 3 as a fast surrogate, everything running locally. The hardware is an AMD Ryzen AI 395+ (there is a "Max" in the name somewhere). Not the best performance per watt, but I eyeballed less than 120 W for less than an hour.

The resulting .tex file was broken, with about 100 errors. Lazily, I tried another local model to fix it. The repair run was surprisingly long and added to the power cost, but it worked.