A Brief, Incomplete, and Mostly Wrong History of Programming Languages
James Iry; Thursday, May 7, 2009

1801 - Joseph Marie Jacquard uses punch cards to instruct a loom to weave "hello, world" into a tapestry. Redditors of the time are not impressed due to the lack of tail call recursion, concurrency, or proper capitalization.

1842 - Ada Lovelace writes the first program. She is hampered in her efforts by the minor inconvenience that she doesn't have any actual computers to run her code. Enterprise architects will later relearn her techniques in order to program in UML.

1936 - Alan Turing invents every programming language that will ever be but is shanghaied by British Intelligence to be 007 before he can patent them.

1936 - Alonzo Church also invents every language that will ever be but does it better. His lambda calculus is ignored because it is insufficiently C-like. This criticism occurs in spite of the fact that C has not yet been invented.

1940s - Various "computers" are "programmed" using direct wiring and switches. Engineers do this in order to avoid the tabs vs spaces debate.

1957 - John Backus and IBM create FORTRAN. There's nothing funny about IBM or FORTRAN. It is a syntax error to write FORTRAN while not wearing a blue tie.

@dougmerritt also: COBOL was designed explicitly to forbid humor at compile-time
@synlogic4242
Especially when not wearing a tie! (Ironic humorous ties don't count)
@dougmerritt lol! true story: the first professional programmer I knew, as a kid, was a COBOL programmer. for a bank. a woman, a friend's mom. she was of that era when there was mostly just COBOL and Fortran for earning paychecks, and a surprising percentage of the COBOL devs then were women. Fortran was a little more of a sausage fest. anyway she would often bring home printouts of the bank's COBOL code and we got to learn from it. I waaaay preferred C and Lisp

@synlogic4242
The most important threshold crossed by larval programmers is the point at which one knows enough to have strong preferences for (and against) the various programming languages at hand. 😉

When I learned the basics of COBOL, I wasn't at that point yet, so I merely found it interesting, rather than intensely annoying.

The whole idea that "if we make the language English-like, it'll be much easier for managers to understand what their employees are up to" was one hell of an example of wishful thinking.

@dougmerritt agreed. I'm reminded of the lessons of COBOL when I evaluate LLM prompting and vibecoders. I see the latter as magnifying the sins of the former.

however, SQL to me feels like the most effective "done right" successor to the spirit of COBOL: just a DSL where the domain is databases and transactions. that domain is critical to nail when it comes to defining, querying, and mutating biz records reliably at scale. NAILED IT. (see the sketch after this post.)

SQL has lived a long time so far and will likely live a lot longer. whereas I see LLM prompting & vibecoding as fragile and nondeterministic as hell -- a house of cards!
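To make the define/query/mutate point above concrete, here is a minimal sketch using Python's stdlib sqlite3 module; the "accounts" table and its columns are invented for illustration, not anything from the thread:

    import sqlite3

    # throwaway in-memory database; table and column names are made up for this example
    con = sqlite3.connect(":memory:")

    # define: declare the shape of the business records
    con.execute(
        "CREATE TABLE accounts (id INTEGER PRIMARY KEY, owner TEXT, balance_cents INTEGER)"
    )

    # mutate: insert and update inside a transaction, so it commits atomically or not at all
    with con:
        con.execute(
            "INSERT INTO accounts (owner, balance_cents) VALUES (?, ?)", ("alice", 10_000)
        )
        con.execute(
            "UPDATE accounts SET balance_cents = balance_cents - ? WHERE owner = ?",
            (2_500, "alice"),
        )

    # query: declarative -- says what to fetch, not how to fetch it
    for owner, balance in con.execute("SELECT owner, balance_cents FROM accounts"):
        print(owner, balance)  # -> alice 7500

The SQL strings carry the domain logic; the host language is just plumbing, which is roughly the "DSL done right" argument being made above.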

@synlogic4242
Definitely a house of cards, but that doesn't mean it's going away. There are clearly going to be large impacts on the global economy regardless of how well or poorly they hammer the square thingie into the round hole (or is it vice versa)

The immediate pain of skyrocketing prices and dwindling availability (of GPUs, RAM, hard drives, SSDs, etc) is just the precursor.

Sad times. LLMs *do* do a nice job of natural language, and people have always confused language and thought.

But I digress.

@dougmerritt agreed. LLMs seem to have been a huge leap forward in *simulating* interactive language conversations and just plain "understanding" (at a shallow level, at least?) what they're asked. they're also clearly deeply/derply broken in the brain -- i.e. not thinking.

I used Gemini a lot earlier this year, and for like the first week I was in the honeymoon phase. months later I had run into all the clownshow headscratching stuff

agreed, the impact on hardware markets is bad in the near term

I've been writing a book on HPC (sloooowly, in rare free time). I'm curious whether the AI boom demand will collapse before I publish it haha. soooooooo much compute inefficiency to try tuning in that space, imo