The thing is, *even if* LLMs made me produce code 25% faster like they claim (and they don’t), it would still be a net negative even without all the costs (direct and indirect), simply because a human wouldn’t have the innate understanding of the code that comes with having written it, which short-circuits so much later. Most of coding time is NOT producing the initial version. We’ve known this for decades. It doesn’t matter how much people want that not to be true https://mikelovesrobots.substack.com/p/wheres-the-shovelware-why-ai-coding
@sinbad Good point! I don't get all this hype about LLMs making you code faster. Most developers only spend a small portion of their time on actual coding, and even less of it on writing the initial version of the code (as you mentioned).
If the only goal is to save time writing code, then you'll likely develop your skills and experience less if you blindly rely on LLMs for that.
However, if you instead aim at spending more time on coding by using LLMs, then it's possible they could help you write better code, or learn and understand things better. Something that might actually pay off. But unfortunately that's not at the centre of the ongoing hype..
I mainly see two types of use cases for LLMs in code writing these days:
1. I need to write lots of boilerplate code, fast. If that's the case, then you have a problem already, though.
2. I don't understand this language/maths/algorithm well enough, and want the LLM to write the code. Then you have another problem: until we have super-advanced AI that can replace human beings in every way, you don't want your developers dealing with code and languages they don't understand.
You could instead use LLMs to help you brainstorm, understand algorithms/maths, search documentation and libraries, and come up with good solutions - but that's not a matter of saving time, which is the only thing people seem to care about.
