The reason I get so annoyed about people pitching LLMs as a way to 'democratise programming' or as end-user programming tools is that they solve the wrong problem.

The hard part of programming is not writing code. It's unambiguously expressing your problem and desired solution. Imagine if LLMs were perfect programmers. All you have to do is write a requirements document and they turn it into a working program. Amazing, right? Well, not if you've ever seen what most people write in a requirements document or seen the output when a team of good programmers works from a requirements document.

The most popular end-user programming language in the world (and, by extension, the most popular programming language), with over a billion users, is the Calc language that is embedded in Excel. It is not popular because it's a good language. Calc is a terrible programming language by pretty much any metric. It's popular because Excel (which is also a terrible spreadsheet, but that's a different rant) is basically a visual debugger and a reactive programming environment. Every temporary value in an Excel program is inspectable and it's trivial to write additional debug expressions that are automatically updated when the values that they're observing change.

Much as I detest it as a spreadsheet, Excel is probably the best debugger that I have ever used, including Lisp and Smalltalk.

The thing that makes end-user programming easy in Excel is not that it's easy to write code, it's that it's easy to see what the code is doing and understand why it's doing the wrong thing. If you replace this with an LLM that generates Python, and the Python program is wrong, how does a normal non-Python-programming human debug it? They try asking the LLM, but it doesn't actually understand the Python so it will often send them down odd rabbit holes. In contrast, every intermediate step in an Excel / Calc program is visible. Every single intermediate value is introspectable. Adding extra sanity checks (such as 'does money leaving the account equal the money paid to suppliers?') is trivial.
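As a concrete sketch of such a sanity check (the named ranges `Outflows` and `SupplierPayments` are made up for illustration), a single Calc formula in a spare cell re-evaluates automatically whenever anything it observes changes:

```
=IF(SUM(Outflows)=SUM(SupplierPayments), "OK", "Off by "&(SUM(Outflows)-SUM(SupplierPayments)))
```

No build step, no breakpoint, no print statement: the check just sits next to the data and flips to an error message the moment the invariant is violated.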

If you want to democratise programming, build better debuggers, don't build tools that rapidly generate code that's hard to debug.

@david_chisnall this isn't even a new phenomenon. Before there was vibe coding, there were "NoCode solutions". The problem is always the same: either the result is janky, limited, and/or not up to spec, or the person creating it has inadvertently become a programmer, with all the complexity that entails.

In the case of NoCode, this was mostly a way to underpay programmers, by not calling them that. I expect similar in the case of LLMs.

@sophieschmieg @david_chisnall And before that - COBOL

"COBOL is a language intended for use by non-programmers and, by extension, for non-programs"

@jackeric @sophieschmieg @david_chisnall Except COBOL works and was specifically designed for its problem domain. When FORTRAN was introduced, it got the same criticism: that it would deskill programming. Like COBOL, it was aimed at subject matter experts, not professional programmers.

COBOL is a weird and wordy language, an evolutionary dead end that nevertheless thrives in its ecological niche. Like FORTRAN, it is not a general purpose language but it is reliable and good enough at its job that it's difficult to replace. Neither are interesting to CS academics and that goes double for Excel.

I get it, it's a joke. No respectable CS grad would touch Excel, COBOL, or Fortran with a ten foot pole. They're inelegant and vulgar and contemptible because they haven't been displaced and eradicated by CS-oriented general purpose languages.

I am reminded that Jean Sammet (a principal developer of COBOL along with Grace Hopper) basically saved the ACM from insolvency in the early 1970s. Professional computing owes more to COBOL than it realizes.

@arclight @jackeric @sophieschmieg @david_chisnall Hmm? Fortran absolutely can be a general purpose language, it's just mainly used by certain sectors. And I'm a CS academic who did spend years on Fortran program analysis. Can't say I'm respectable though 😏
@md @jackeric @sophieschmieg @david_chisnall I mean, it mostly is now. It supports recursion, array operations, OO, pointers, and various forms of parallel programming without a ton of fiddly overhead. The standardized interface to C makes cross-language programming pretty straightforward and lets you link in a lot that the native language doesn't support (or supports with difficulty). Still, I'd recommend a different language for general use, especially if you want a rich standard library, decent text processing, networking support, enums, or generics. It's still not a bad choice for numerically intensive code - it's low-magic, extremely stable, fast, and straightforward for subject matter experts to grasp. Saying good things about it in public will not get you invited to any parties, however.
@arclight @jackeric @sophieschmieg @david_chisnall Yup, I'd rather that scientists continue to use Fortran than something dastardly like C++. I'm definitely not getting invited to any parties.

@arclight @jackeric @sophieschmieg @david_chisnall FORTRAN does have one point of CS interest - it (uniquely?) doesn't have a separate lexical analysis layer between the individual characters and the syntax, since blanks are insignificant in fixed-form source, thus allowing for the famous

DO 10 I = 1.10

which the compiler happily accepts as an assignment of 1.10 to a variable named DO10I, rather than the intended loop header DO 10 I = 1,10 with a comma.