So apparently the source code of one highly valued, LLM-based code-production product has been inspected by external actors and found to be extensive, but not exactly an impressive feat of engineering.
Reportedly it consists in part of natural-language directions that attempt to coax, beg, and beseech the text generator into producing the desired kinds of output.

Has anyone coined the term "autocompleading" for this style of software development yet?

@hko "Autocompleading" is a fantastic word, but I want to propose a different meaning: you know how after the LLM does the task, it suggests a bunch of next steps it *could* do if you only *asked* for them? And won't you please pretty please ask me to burn more tokens to do this next thing? There are so many things I could do if you'd like me to... Just let me know which you want!

I guess boiling lakes makes one thirsty.