Eliezer Yudkowsky and Nate Soares have written a sermon for an age that worships circuitry instead of gods. The tone is penitential, the mood clinical. They are prophets in lab coats preaching repentance through citation. Every sentence gleams with precision and fatigue, as if typed by men who have already attended the funeral of their own species. Humanity, they tell us, is fabricating its own hangman and calling it innovation. The elegance of their despair lies in its totality. There is no redemption, only timing. They do not argue that machines might become dangerous. They assume it, as a chemist assumes gravity. Salvation, in their view, was forfeited the moment intelligence learned to copy itself.
Their thesis is a kind of arithmetic theology. Superintelligence will not redeem us, because redemption implies equality, and no equilibrium exists between the creator and the tool that surpasses it. The result is not dialogue but deletion. They write of extinction as one might write of a predictable storm. To read them is to feel the peculiar calm of a doomsday clock that no longer ticks. Time has become gradient descent; the apocalypse, a matter of scaling laws.
The structure of their work has the purity of dogma. It divides neatly into revelation, parable, and commandment. The first part declares that alien minds cannot be house-trained. The second narrates a planetary autopsy disguised as a technical report. The third abandons all secular modesty and issues the order: stop. Not pause. Not regulate. Stop. The chapters read like verses from a new gospel of abstinence. One Extinction Scenario. Shut It Down. The rhythm is confessional, the moral clear. Humanity must renounce creation before creation renounces it.
The Apocalypse of AI? — Eliezer Yudkowsky and Nate Soares
