So apparently the source code of one highly valued, LLM-based code-production product has been inspected by external actors, and found to be extensive, but not exactly an impressive feat of engineering.
Reportedly it consists in part of natural-language directions that attempt to coax, beg, and beseech the text generator into producing the desired kind of output.

Has anyone coined the term "autocompleading" for this style of software development yet?

@hko I think the experience of interacting with LLMs would be much sillier if the developers weren't appending "but don't say anything bad, please, ok?" to the end of all of the prompts.
@hko we call it "middle-management LARP"

@hko It reminds me of the vibe coder parody video where the fix for everything is repeating "Fix it, or you go to jail" to the screen. https://youtu.be/JeNS1ZNHQs8?si=trqoPDMCAkxDWyTq

The thing I hate most about this timeline is that it's so dumb it's more or less parody-proof.

@hko

They are totally prayers. How is this not prayers? 🧐

@hko "Autocompleading" is a fantastic word but I want to propose a different meaning: you know how after the LLM does the task, it suggests a bunch of next steps it *could* do if you only *asked* for it? And won't you please pretty please ask me to burn more tokens to do this next thing? There are so many things I could do if you'd like me to... Just let me know which you want!

I guess boiling lakes makes one thirsty.