A Brief, Incomplete, and Mostly Wrong History of Programming Languages
James Iry; Thursday, May 7, 2009

1801 - Joseph Marie Jacquard uses punch cards to instruct a loom to weave "hello, world" into a tapestry. Redditers of the time are not impressed due to the lack of tail call recursion, concurrency, or proper capitalization.

1842 - Ada Lovelace writes the first program. She is hampered in her efforts by the minor inconvenience that she doesn't have any actual computers to run her code. Enterprise architects will later relearn her techniques in order to program in UML.

1936 - Alan Turing invents every programming language that will ever be but is shanghaied by British Intelligence to be 007 before he can patent them.

1936 - Alonzo Church also invents every language that will ever be but does it better. His lambda calculus is ignored because it is insufficiently C-like. This criticism occurs in spite of the fact that C has not yet been invented.

1940s - Various "computers" are "programmed" using direct wiring and switches. Engineers do this in order to avoid the tabs vs spaces debate.

1957 - John Backus and IBM create FORTRAN. There's nothing funny about IBM or FORTRAN. It is a syntax error to write FORTRAN while not wearing a blue tie.

1958 - John McCarthy and Paul Graham invent LISP. Due to high costs caused by a post-war depletion of the strategic parentheses reserve, LISP never becomes popular[1]. In spite of its lack of popularity, LISP (now "Lisp" or sometimes "Arc") remains an influential language in "key algorithmic techniques such as recursion and condescension"[2].

1959 - After losing a bet with L. Ron Hubbard, Grace Hopper and several other sadists invent the Capitalization Of Boilerplate Oriented Language (COBOL). Years later, in a misguided and sexist retaliation against Adm. Hopper's COBOL work, Ruby conferences frequently feature misogynistic material.

1964 - John Kemeny and Thomas Kurtz create BASIC, an unstructured programming language for non-computer scientists.

1965 - Kemeny and Kurtz go to 1964.

1970 - Guy Steele and Gerald Sussman create Scheme. Their work leads to a series of "Lambda the Ultimate" papers culminating in "Lambda the Ultimate Kitchen Utensil." This paper becomes the basis for a long running, but ultimately unsuccessful run of late night infomercials. Lambdas are relegated to relative obscurity until Java makes them popular by not having them.

1970 - Niklaus Wirth creates Pascal, a procedural language. Critics immediately denounce Pascal because it uses "x := x + y" syntax instead of the more familiar C-like "x = x + y". This criticism happens in spite of the fact that C has not yet been invented.

1972 - Dennis Ritchie invents a powerful gun that shoots both forward and backward simultaneously. Not satisfied with the number of deaths and permanent maimings from that invention he invents C and Unix.

1973 - Robin Milner creates ML, a language based on the M&M type theory. ML begets SML which has a formally specified semantics. When asked for a formal semantics of the formal semantics Milner's head explodes. Other well known languages in the ML family include OCaml, F#, and Visual Basic.

1980 - Alan Kay creates Smalltalk and invents the term "object oriented." When asked what that means he replies, "Smalltalk programs are just objects." When asked what objects are made of he replies, "objects." When asked again he says "look, it's all objects all the way down. Until you reach turtles."

1983 - In honor of Ada Lovelace's ability to create programs that never ran, Jean Ichbiah and the US Department of Defense create the Ada programming language. In spite of the lack of evidence that any significant Ada program is ever completed historians believe Ada to be a successful public works project that keeps several thousand roving defense contractors out of gangs.

1983 - Bjarne Stroustrup bolts everything he's ever heard of onto C to create C++. The resulting language is so complex that programs must be sent to the future to be compiled by the Skynet artificial intelligence. Build times suffer. Skynet's motives for performing the service remain unclear but spokespeople from the future say "there is nothing to be concerned about, baby," in an Austrian accented monotone. There is some speculation that Skynet is nothing more than a pretentious buffer overrun.

1986 - Brad Cox and Tom Love create Objective-C, announcing "this language has all the memory safety of C combined with all the blazing speed of Smalltalk." Modern historians suspect the two were dyslexic.

1987 - Larry Wall falls asleep and hits Larry Wall's forehead on the keyboard. Upon waking Larry Wall decides that the string of characters on Larry Wall's monitor isn't random but an example program in a programming language that God wants His prophet, Larry Wall, to design. Perl is born.

1990 - A committee formed by Simon Peyton-Jones, Paul Hudak, Philip Wadler, Ashton Kutcher, and People for the Ethical Treatment of Animals creates Haskell, a pure, non-strict, functional language. Haskell gets some resistance due to the complexity of using monads to control side effects. Wadler tries to appease critics by explaining that "a monad is a monoid in the category of endofunctors, what's the problem?"

1991 - Dutch programmer Guido van Rossum travels to Argentina for a mysterious operation. He returns with a large cranial scar, invents Python, is declared Dictator for Life by legions of followers, and announces to the world that "There Is Only One Way to Do It." Poland becomes nervous.

1995 - At a neighborhood Italian restaurant Rasmus Lerdorf realizes that his plate of spaghetti is an excellent model for understanding the World Wide Web and that web applications should mimic their medium. On the back of his napkin he designs Programmable Hyperlinked Pasta (PHP). PHP documentation remains on that napkin to this day.

1995 - Yukihiro "Mad Matz" Matsumoto creates Ruby to avert some vaguely unspecified apocalypse that will leave Australia a desert run by mohawked warriors and Tina Turner. The language is later renamed Ruby on Rails by its real inventor, David Heinemeier Hansson. [The bit about Matsumoto inventing a language called Ruby never happened and better be removed in the next revision of this article - DHH].

1995 - Brendan Eich reads up on every mistake ever made in designing a programming language, invents a few more, and creates LiveScript. Later, in an effort to cash in on the popularity of Java the language is renamed JavaScript. Later still, in an effort to cash in on the popularity of skin diseases the language is renamed ECMAScript.

1996 - James Gosling invents Java. Java is a relatively verbose, garbage collected, class based, statically typed, single dispatch, object oriented language with single implementation inheritance and multiple interface inheritance. Sun loudly heralds Java's novelty.

2001 - Anders Hejlsberg invents C#. C# is a relatively verbose, garbage collected, class based, statically typed, single dispatch, object oriented language with single implementation inheritance and multiple interface inheritance. Microsoft loudly heralds C#'s novelty.

2003 - A drunken Martin Odersky sees a Reese's Peanut Butter Cup ad featuring somebody's peanut butter getting on somebody else's chocolate and has an idea. He creates Scala, a language that unifies constructs from both object oriented and functional languages. This pisses off both groups and each promptly declares jihad.

Footnotes
[1] Fortunately for computer science the supply of curly braces and angle brackets remains high.
[2] Catch as catch can - Verity Stob
https://james-iry.blogspot.com/2009/05/brief-incomplete-and-mostly-wrong.html


@dougmerritt it is a pity you couldn't have worked in Anders' work on Turbo Pascal and Delphi, because I was actually expecting it🤣
@wndxlori @dougmerritt wow, Delphi. I'd forgotten all about that, but I did spend several months with it.

@bocs @dougmerritt

Still going strong apparently. I kinda lost track after I left TeamB in the post-Borland days.

https://www.embarcadero.com/products/delphi

Delphi: IDE Software Overview - Embarcadero

Delphi is the fastest way to write, compile, package and deploy cross-platform native applications on Windows, macOS, iOS, Android and Linux. See more.

Embarcadero
@wndxlori @bocs @dougmerritt there's also Lazarus, which is more or less an open source equivalent of Delphi.
https://www.lazarus-ide.org/
Lazarus Homepage

Lazarus is a professional open-source cross platform IDE powered by Free Pascal

@rpbook @wndxlori @dougmerritt huh! cool.... will look at this, thx. absolutely no reason to go back into that quagmire though ;)

@bocs
I never learned that part of the world; what made Delphi a quagmire?

@rpbook @wndxlori

@dougmerritt @bocs @rpbook @wndxlori My lecturers trying to tell students that it was fine using Delphi, that it was just Pascal but for Object Oriented Programming.

I still shudder 25 years later at the memory. I was perfectly happy learning Pascal.

@wndxlori @dougmerritt It was James Iry's blog post this came from!

(TP5.5 was my first compiler - at least in that sense of "first compiler")

@flippac
Not everyone noticed that I credited James Iry as the author, but they do of course deserve 100% of the credit.

@wndxlori

@dougmerritt @wndxlori Except what credit (or discredit) goes to Saunders Mac Lane, who actually did say "a monad is just a monoid in the category of endofunctors" in Categories for the Working Mathematician!
@wndxlori @dougmerritt First time I tried C#, I thought it was Delphi with begin … end replaced by { … }.
@jernej__s
I came to C# from years of Java, and was like … dude, you barely even chiseled off Sun’s copyright.
@wndxlori @dougmerritt You mean Anders' work replicating Apple & Wirth’s Clascal for Lisa (later renamed Object Pascal for Macintosh) and then adding CLOS-style slots to it? ;)

@dougmerritt Remember IBM's flirt with APL?

There are two things you must do
before your life is done:
Write two lines in APL
and make the buggers run.

😂 Good 🧵. You forgot PL/1, but it was probably before your time ☺️

@RonInDortmund
Ah yes, from "The Devil's DP Dictionary", by my friend, the late great Stan Kelly-Bootle; never shall we know his like again.
@dougmerritt Ha! Yes, almost as good as the original by Ambrose Bierce 👹📖
@RonInDortmund @dougmerritt Oh yes - APL with the Hieroglyphics 😵‍💫
@dougmerritt
I actually did laugh out loud. Fine work :)

@dougmerritt

Oh my, I’m still itching in ECMAscript grammars, (with nightmares from the days I wrote an entire sailing club membership maintenance system in APL).

Don’t talk about my expert system implementation in Ada.

@dougmerritt yep C#/.NET was MS flipping off Sun and the courts about their orig Java cloning efforts

@dougmerritt 😂😂😂😂👍🏾

And true!

@dougmerritt
Eagerly awaiting FORTH, nim, Modula 2, Prolog, Erlang, and Julia

@dlakelan

I'm still strung out on SNOBOL.

@dougmerritt

@dougmerritt
Nice. You forgot Algol, APL, and Forth.

@oneiros @dougmerritt

What about Brainfuck, Malbolge, Mondrian and Shakespeare????

#Whataboutism #Shitpost

@dougmerritt A colleague of mine ~1980 quit to become a contractor; the first job interview required facility in COBOL, which she read up on the night before. Just as the interviewer started asking knowledge questions, an employee came in asking for help with a program crash. Listening for a few seconds, she asked “did you check for end of file?” Got the job.

Her previous experience was with IBM where error messages always included a number. The decimal number she chose was, in hex, "B00B00BAD"

@dougmerritt So funny! 🤣 - @siracusa might need to explain the Perl jokes. He and @marcoarment might appreciate the PHP one.

@dougmerritt

Early 1960s: Because conventional typewriters don't have enough special characters, Iverson invents APL, the least structured formal programming language ever devised. Due to this, it becomes a mark of honor to encode any APL program into a single line.

1962-1967: In an attempt to invent Sendmail configuration files before Berkeley develops Sendmail, Bell Labs produces a series of SNOBOL-like languages including SNOBOL.

1968: Chuck Moore invents Forth to control telescopes.

@dougmerritt this is outstanding and I lol-snorted my tea over the table when I reached Larry Wall. Aside from which, Perl always has been and always will be my go-to language to solve any problem.
@dougmerritt Biblically-accurate description of the genesis of Perl. Great thread!

@dougmerritt

Brilliant. Bin them all other than C and Perl, I say. Everything since is the same thing with a few bells and whistles bolted on.

Actually, FORTRAN was not bad, but then as documented elsewhere, Real Programmers can write FORTRAN in any language.

@dougmerritt That bit about Alan Kay and turtles is much more true than most people would think.

@dougmerritt

"1965 - Kemeny and Kurtz go to 1964."

Ha ha. Ha ha ha. Very funny. :p

@dougmerritt

Jacquard, while credited with the invention of the first programmable industrial robot, notably did not make it, because of the loss of weaver jobs that he realised would follow.

Someone else did and gave rise to the #luddites.

Jacquard was right.
He also invented a shitting duck robot.

#ai #labour #labor

@dougmerritt

I wrote FORTRAN at IBM wearing a Madras jacket. I doubt that I was wearing a blue tie but I believe that I was indeed required to wear some sort of tie. I was a summer intern*, and the jacket got me transferred to a less visible, and more congenial, unit. Just a summer job and I wound up writing GPSS for the bulk of it. It was quite pleasant.

My family was more APL-oriented but that seemed like a cult to me.

* Under a fellow named Paul Cross, at White Plains, mid 1960s.

@dougmerritt also: COBOL was designed explicitly to forbid humor at compile-time
@synlogic4242
Especially when not wearing a tie! (Ironic humorous ties don't count)
@dougmerritt lol! true story: the first professional programmer I knew, as a kid, was a COBOL programmer. for a bank. woman. a friend's mom. she was of that era where there was mostly just COBOL and Fortran for earning paychecks, and surprising percentage of the COBOL devs then were women. Fortran was a little more of the sausage fest. anyway she would often bring home printouts of the bank's COBOL code and we got to learn from it. I waaaay preferred C and Lisp

@synlogic4242
The most important threshold crossed by larval programmers is the point at which one knows enough to have strong preferences for (and against) the various programming languages at hand. 😉

When I learned the basics of COBOL, I wasn't at that point yet, so I merely found it interesting, rather than intensely annoying.

The whole idea that "if we make the language English-like, it'll be much easier for managers to understand what their employees are up to" was one hell of an example of wishful thinking.

@dougmerritt agreed. I'm reminded of the lessons of COBOL when I evaluate LLM prompting and vibecoders. I see the latter as magnifying the sins of the former.

however, SQL to me feels the most effective "done right" successor to spirit of COBOL. just a DSL where the domain is databases and transactions. critical to nail when it comes to defining, querying, mutating biz records reliably at scale. NAILED IT.

SQL has lived a long time so far and likely will more. whereas I see LLM prompting & vibecoding as fragile and undeterministic as hell -- a house of cards!

@synlogic4242
Definitely a house of cards, but that doesn't mean it's going away. There are clearly going to be large impacts on the global economy regardless of how well or poorly they hammer the square thingie into the round hole (or is it vice versa?).

The immediate pain of skyrocketing prices and dwindling availability (of GPUs, RAM, hard drives, SSDs, etc) is just the precursor.

Sad times. LLMs *do* do a nice job of natural language, and people have always confused language and thought.

But I digress.

@dougmerritt agreed. LLMs seem to have been a huge leap forward in *simulating* interactive language conversations and just plain "understanding" (at shallow level, at least?) what they're asked. they just are also clearly deeply/derply broken in the brain -- ie. not thinking.

I used Gemini a lot earlier this year, and for like first week I was in honeymoon phase. months later I had run into all the clownshow headscratching stuff

agreed impact on hardware markets is bad in nearterm

I've been writing book on HPC (sloooowly in rare free time). I'm curious whether the AI boom demand will collapse before I publish it haha. soooooooo much compute inefficiency to try tuning, in that space, imo

@dougmerritt How 'bout Modula-2, trying to morph Pascal into C with case sensitivity and upper-case keywords.

Despite that, the first program I wrote for money was in Modula-2!

@dougmerritt @wendynather Reading the Java one in the same manner that Arlo Guthrie describes the draft office really adds something… :)
@dougmerritt Fabulous. I was hoping that Whitespace would get a mention, but maybe the whole thread is a Whitespace program? https://en.wikipedia.org/wiki/Whitespace_%28programming_language%29
Whitespace (programming language) - Wikipedia

@dougmerritt that was brilliant