No, computers won’t replace humans by writing code for themselves.

Please stop with this nonsense.

What we will see, though, is a tremendous loss of productivity as deskilled programmers get less and less education and practice—and take longer and longer to make broken AI-generated code work. Meanwhile, AI models will regress from being trained on their own generated shit.

Eventually AI companies will finally run out of investors to scam—and when they disappear or get so expensive they become unaffordable, “prompt engineers” will be asked to not use AI anymore.

What’s gonna happen then?

We’re losing a whole generation of programmers to this while thought leaders in our field are talking about “inevitability” and are jerking off to sci-fi-nostalgia-fueled fantasies of AGI.

@thomasfuchs

You make the flawed assumption that AI tech will not advance.
That has not been the case so far.
Not sure why you might think this, since there is no evidence for it.

@n_dimension @thomasfuchs You make the flawed assumption that "AI tech" will advance, despite all the best training data already having been used.

Sure, models may get cheaper to train, and there may be more guardrails to prevent the worst outcomes. But even for the modest goal of 100x fewer incorrect results, there is simply no path.

@glent @thomasfuchs

You have a source for your claim "best training data has been used"?
Because I can link you a paper about synthetic data being used for frontier models.

There is one thing I have seen that is stronger than AI being "confidently incorrect", and that is AI antagonists who refuse to learn about AI and base their opinions on information that is often more than a year old.
I am constantly in awe of how sure AI antagonists are, based on their mutually reinforcing echo chamber of social posts.

@n_dimension @glent @thomasfuchs Don't worry: with the escalation of AI's hunger for resources, both AI companies and their users are going to keep escalating the fucking up of the climate, so before it gets that good, we won't be able to breathe.

Or rather, we the people won't; disgusting rich assholes will have their secure places and a few mercenaries protecting them.

@Johns_priv @glent @thomasfuchs

They all have bunkers.
We have breached 7 of the 9 human life support parameters.

Worse yet, they all embrace #transhumanism which is an evil, fascist ideology once you scrape off the lipstick.

But the "#AI uses huge amounts of power" line is mostly nonsense. By the numbers, in 2024 *ALL DATACENTERS* accounted for only about 0.2% of global carbon emissions. And roughly 50% of that is your Facebooks, iClouds and Dropboxes.
All the expert articles I have read so far give "percentages of percentages", avoiding actual numbers, plus appeals to authority ("experts").
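To make the "percentage of a percentage" point concrete, here is the arithmetic using the figures claimed above (the 0.2% and 50% numbers are this post's claims, not independently audited data):

```python
# Back-of-envelope using the figures claimed in this thread.
datacenter_share = 0.002   # claim: all datacenters ~0.2% of global emissions
non_ai_fraction = 0.5      # claim: ~half of that is cloud storage/social media

# Whatever is left over is the implied upper bound on AI's share.
ai_upper_bound = datacenter_share * (1 - non_ai_fraction)
print(f"Implied upper bound for AI: {ai_upper_bound:.1%} of global emissions")
# prints: Implied upper bound for AI: 0.1% of global emissions
```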

I understand that you have to make it sound like an urgent problem, because otherwise there will be no action. Just don't look at the data, lest you get challenged. Also, everyone (especially the uninformed) assumes #aienergyconsumption will grow. That's kinda reasonable, but the slope of that growth isn't, given developments like model distillation. There are 240,000+ models on #huggingface, some billions of parameters large, that you can run on a super beefy home PC. Not a data centre.
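A rough sketch of why a model with billions of parameters fits on a home PC: weight memory scales with parameter count times bytes per parameter, and quantization shrinks the latter. The 7B size and byte-widths below are illustrative assumptions, not figures from any specific model card:

```python
# Approximate memory to hold a model's weights locally.
# Ignores activations and KV cache; numbers are illustrative.

def model_memory_gb(params_billions: float, bytes_per_param: float) -> float:
    """Weight memory in GB = parameters * bytes per parameter."""
    return params_billions * 1e9 * bytes_per_param / 1e9

# A hypothetical 7B-parameter model at common precisions:
for label, width in [("fp16", 2.0), ("8-bit", 1.0), ("4-bit", 0.5)]:
    print(f"7B @ {label}: ~{model_memory_gb(7, width):.1f} GB")
# prints: 7B @ fp16: ~14.0 GB
#         7B @ 8-bit: ~7.0 GB
#         7B @ 4-bit: ~3.5 GB
```

At 4-bit, that's well within a single consumer GPU or even system RAM—which is the distillation/quantization point being made above.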

The really progressive #AI counter movement ought to be on #personalai and #regulateAI

I can't see AI antagonists smashing jet planes and #petroAgriculture, which, in their own style, seems hypocritical and poorly informed.