Stubsack: weekly thread for sneers not worth an entire post, week ending 1st March 2026
I like how even by ACX standards scoot’s posts on AI are pure brain damage
One level lower down, your brain was shaped by next-sense-datum prediction - partly you learned how to do addition because only the mechanism of addition correctly predicted the next word out of your teacher’s mouth when she said “three plus three is . . . ” (it’s more complicated than this, sorry, but this oversimplification is basically true). But you don’t feel like you’re predicting anything when you’re doing a math problem. You’re just doing good, normal mathematical steps, like reciting “P.E.M.D.A.S.” to yourself and carrying the one.
The most compelling analogy: this is like expecting humans to be “just survival-and-reproduction machines” because survival and reproduction were the optimization criteria in our evolutionary history. […] This simple analogy is slightly off, because it’s confusing two optimization levels: the outer optimization level (in humans, evolution optimizing for reproduction; in AIs, companies optimizing for profit) with the inner optimization level (in humans, next-sense-datum prediction; in AIs, next-token prediction). But the stochastic parrot people probably haven’t gotten to the point where they learn that humans are next sense-datum predictors, so the evolution/reproduction one above might make a better didactic tool.
He also threatens an Anti-Stochastic-Parrot FAQ. Here’s hoping that if this happens, Bender et al. enthusiastically point out it’s coming from a guy whose long-term master plan is to fight evil AI with eugenics.