Turns out "what if AI gets too smart?" was the wrong question. "What if we just hand it all our keys and walk away?" was sitting right there. Classic enshittification arc: make it frictionless to use, make it catastrophic to secure, act surprised when the two meet. https://honnibal.dev/blog/clownpocalypse
The looming AI clownpocalypse · honnibal.dev

Exploits will soon cost less to develop autonomously than they earn. What then?