After thinking more about AI making programming obsolete:

Getting the level of specificity needed to ask a computer to perform tasks will require dedicated jargon, not just regular English. And over time you will shorten that jargon with abbreviations and symbols for conciseness.

And then you’d have reinvented programming languages.

@carnage4life agreed, English is bad enough as a language for writing specs that humans then interpret, let alone machines!

I suspect what you will be able to accomplish, though, is customisation or configuration of existing systems that do things close to what's needed. Essentially optimization or configuration tasks, modular assembly of functionality, process workflows, and so on. Which, to be fair, is a large amount of the work most software developers do today.

@bendelarre @carnage4life and then irony will strike with someone waking up one day with the idea of doing YAML config for your AI code generator...

@bendelarre @carnage4life

Any / every language involves approximation and interpretation. They’re maps, not territories.

To clarify… this short essay (book available online, fwiw) is a fascinating exploration of the relationships between language and the world:

https://en.wikipedia.org/wiki/The_Analytical_Language_of_John_Wilkins?wprov=sfti1


@carnage4life Maddening to me that AI is largely controlled by command interfaces!
And these command interfaces have so few options (for now) that they're still largely word-salad slot machines!
@drewpickard @carnage4life This is not actually true. The AI you see (I guess AI search like ChatGPT and Perplexity) might be, but we are iterating crazy fast right now. I have not seen this for a long time; this toot is already out of date. The kind of problem he is talking about is semantic routing, and it is not solved with 'dedicated jargon': it is the same old Pydantic and schemas. Facebook's Llama models are really good, and OpenAI have many upstart challengers. Google seem lost, however.
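(For readers unfamiliar with the schema approach mentioned above: the idea is that an LLM's output is constrained by a machine-checkable contract rather than by prompt jargon. A minimal stdlib sketch of that idea follows — the `Route` labels and `parse_route` helper are hypothetical illustrations, standing in for what a Pydantic model would enforce.)

```python
import json
from enum import Enum

# Hypothetical route labels for a semantic router; a Pydantic model
# would normally declare and enforce this schema.
class Route(str, Enum):
    CODE_GEN = "code_gen"
    SEARCH = "search"
    CHAT = "chat"

def parse_route(raw: str) -> Route:
    """Validate an LLM's JSON reply against a fixed schema.

    A reply that doesn't match the schema is rejected outright
    (ValueError for unknown labels, KeyError for missing fields).
    The model is constrained by a contract, not by 'dedicated
    jargon' in the prompt.
    """
    payload = json.loads(raw)
    return Route(payload["route"])

# A well-formed reply parses; free text or an unknown label does not.
ok = parse_route('{"route": "search"}')  # → Route.SEARCH
```

In the real thing, Pydantic adds field types, nested models, and readable validation errors on top of this pattern, but the control-flow idea is the same.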
@nf3xn
what specifically is not actually true?
@carnage4life It depends on one's definition of programming. Someone has to write the prompt that tells the AI what to build. We all know that programming is more than just coding. AI will become one extra tool in our toolbox, nothing more and nothing less.
@alterelefant @carnage4life Full ack - I already use GitHub Copilot frequently and it's a helpful tool, but for me it's more like the evolution of IntelliSense. You still have to architect the application, troubleshoot issues and review the code.
One thing where it is really helpful is when using semi-familiar languages. I don't like PowerShell, but have to write scripts sometimes. That's much easier now, because I don't have to look up every minute detail…
@AUROnline @carnage4life Exactly that, it is another handy tool to increase productivity. AI will assist programmers, AI won't replace programmers.
@alterelefant @AUROnline @carnage4life
And we hope that said programmers are sufficiently able to detect when the AI spouts complete and utter bullshit that looks plausible at first glance.
@TheLancashireman @AUROnline @carnage4life The new tool needs babysitting.
@alterelefant @TheLancashireman @AUROnline @carnage4life I think it's a question of expectation. If you expect flawless results and perfect knowledge, it will not work and they will disappoint. I don't expect them to be perfect, but to produce starting points. When writing texts, I personally find it easier to rework a starting point than to start from a blank page.
@alterelefant @carnage4life If it works the way AI works in localization: Yes, it will be one tool, but management will expect one worker to grab that tool and do the work of four people, meaning of course the other three get fired. Whether or not the tool is actually capable of replacing those people without significant loss of quality remains to be seen.
@Teskariel @carnage4life Management should have realistic expectations.
@carnage4life Just with a 'compiler' in between that sometimes randomly does not do what your 'code' was written for 😱
@demiurg
Except that compilers are deterministic - at least, at the moment. Same input --> same output is normal. But watch out for "minor" changes in the input.

@carnage4life Yep, that’s it exactly. However, I still think many application scaffolds will be made with AI, because they are often not very novel.

When I was young I thought it would be possible to avoid constant reinvention of the wheel by making a catalog of abstractions that you could assemble like Lego bricks. (Yeah, I know 😂)

Then I started to think that was intractable for humans - even if you could assemble the catalog, how would you search it?

Now I think the answer might be AI.

@decoderwheel @carnage4life Is "I think the answer might be AI" the new "I'll use regular expressions and now I have two problems"? 😁

https://blog.codinghorror.com/regular-expressions-now-you-have-two-problems/

@ibboard @carnage4life Sure, but that’s coding for you. We never reduce complexity, we just move it around to where it’s more convenient. Until it’s inconvenient 😁

@decoderwheel @carnage4life When is what is currently called "AI" *not* inconvenient? 😉

At least regular expressions are interpretable and consistent. LLMs are neither.

@carnage4life Sometimes when I try to read complex legalese, I think to myself "This is what you get for trying to use English as a programming language".
@carnage4life Corollary: If a significant amount of your software product can be reliably developed via AI, many others have already developed that product. You are not unique or innovative. You’re merely practicing commodity arbitrage.
@carnage4life now I'm wondering if a neural-network-based (not an LLM or anything so wasteful and stolen) compiler would be a funny project... Probably not a good idea, but potentially funny

@carnage4life Dijkstra said much the same things in "On the foolishness of 'natural language programming'".

Form is useful! Logic is useful! If we need to tell computers how to carry out procedures, we *need* a way to specify those procedures in a way that lets them be understood, processed, and manipulated. That's what programming languages are, and that's what natural languages *aren't*!

https://www.cs.utexas.edu/users/EWD/transcriptions/EWD06xx/EWD667.html


@carnage4life
A few years back there was a cartoon along these lines - I thought I'd saved it but I can't find it now, and I can't find it online either, which is disappointing.
@carnage4life @cstross yuuuup. This was basically my objection to the "visual programming tools" concept a couple decades back: writing the code isn't the hard part; saying what you want unambiguously for an extremely stupid machine to do it is the hard part, and that just isn't reducible (no matter how much the investor class wishes they could just stop having to pay all those pesky humans).
@carnage4life thank you for articulating this.

@matthewskelton @carnage4life It’s (more or less) the same logical flaw that’s repeated at least once a decade, by people who’ve never read No Silver Bullet. The hard part of coding is not typing. But I still think that a future generation of LLMs might be useful assistants.

(The current generation are dreadful and an environmental horrorshow to boot)

@carnage4life Ugh, transforming programming into lawyering with legalese. Well, there's a reason the term "language lawyer" exists.