

RE: https://mstdn.social/@TechCrunch/116223597370101052
Not sure I can envision AI replacing anyone anywhere at this point.
When it comes to software development, releasing bad code is releasing bad code. The developer is responsible.
If AI changes your opinion of what constitutes good code, you shouldn't be writing or releasing anything.
Be anti-closed AI, or if you're a publisher, be anti-unlicensed AI, or if you're worried about safety, think of regulations and strategies.
If you're worried about everyone losing their jobs, I wouldn't be. If you're worried about environmental impact, know that AI's footprint is bounded by its economic footprint, which, per the previous point, isn't as significant as feared.
I worry about so many people abandoning their tools and thus disempowering themselves and their cause. 'Anti-' as a stance cannot hold. It has no substance.
It is the anti-nuclear movement, the anti-vax movement; it is a Congress that votes down every bill because its members believe in nothing but tearing down imagined enemies.
When I first saw AI coming into its recent form, years before ChatGPT, I knew how dramatic the shift would be. I didn't run away from it. I faced my fears immediately. I knew I had to begin studying it to remain relevant after the tidal shift.
While there are many valid risks, while AI can't yet do even trivial coding reliably on its own, and while it should never be consolidated under the control of just a few, AI can be empowering and beneficial.
It is one thing to have a material objection to a thing (like AI). It is entirely another to hold irrational delusions that hurt yourself, hurt others, and hurt your causes (anti-vaxxers).
We're at that point now. People who hold unreasonable opinions about AI need to step back, take a breather, and figure out why they are fearful or angry.
Being blanket anti-AI is not a valid position to hold. If you find yourself feeling that way, it's time to step away and stop staring into the void.