Is anybody else bored of talking about AI?
https://blog.jakesaunders.dev/is-anybody-else-bored-of-talking-about-ai/
This might sound like snark, but I truly don’t mean it that way.
I think what’s interesting about AI, and why there’s so much conversation, is that in order to be a good user of AI, you have to really understand software development. All the people I work with who are getting the most value out of using AI to deliver software are people who are already very high-skilled engineers, and the more years of real experience they have, the better.
I know some guys who were road warriors for many years: everything from racking and cabling servers, setting up infrastructure, and getting huge cloud deployments going, all the way to embedded software, video game backends, etc. These guys were already really good at automation, seeing the whole life cycle of software, and understanding all the pressure points. For them, AI is the ultimate power tool. They’re just flying with it right now. (All of them are also aware that the AI vampire is very real.)
There’s still a lot to learn, and the tools are still very, very early on, but the value is clear.
I think for quite a few people, engaging with AI is maybe the first time in their entire career that they’ve had to engage with systems thinking in a very concrete and directed way. That’s why so many software engineers are having an identity crisis: they’ve spent most of their career focusing on one very small section of the overall SDLC, all while believing that was most of what they needed to know.
So I think we’re going to keep talking for quite a while, and the conversation will continue to be very unevenly distributed. Paradoxically, I’m not bored of it, because I’m learning so much listening to intelligent people share their learnings.
Spot on take. The people I’ve noticed saying things like “it’s not useful” are the ones doing so little with it that they can’t see the value.
This isn’t to say there’s no hype. Just that if you’re not seeing big productivity gains, you need to make sure you really are an outlier and not just surplus to requirements.
I rarely come across people who flat out say "it's not useful". They exist, but IME they're the minority.
Rather, I hear a lot of nuanced opinions that the tech is useful in some scenarios, but that the net benefit is not clear. That is, the tech has enough drawbacks that extracting actual value from it takes real effort. This is an opinion I personally share.
In most cases, those "big productivity gains" are vastly blown out of proportion. In the context of software development specifically, sure, you can now generate thousands of lines of code in an instant, but writing code was never the bottleneck. It was always the effort to carefully design and implement correct solutions to real-world problems. These new tools can approximate this to an extent, when given relevant context and expert guidance, but the output is always unreliable, and very difficult to verify.
So anyone who claims "big productivity gains" is likely not bothering to verify the output, which in most cases will eventually come back to haunt them and/or anyone who depends on their work. And this should concern everyone.