Konrad

@retrocomputing
15 Followers
71 Following
76 Posts
IT professional, passionate about computing and gaming history.

Maybe 15 years ago, I was recovering or modernizing some legacy F77 code and, out of exasperation, asked "Who taught engineers to code this badly?"

Through the magic of used bookstores and online sellers, I picked up maybe 15-20 textbooks on Fortran published from the early 60s to about 1990 (long after the codes I was dealing with were written). I was determined to find a Smoking Gun in engineering computing education that explained the code quality I was seeing.

Instead, what I found almost universally was good, solid guidance on writing numerical code and writing code in general. Even as far back as McCracken's classic text on Fortran, the books all advocated good practice as it was known at the time.

It was about this time that I was throwing myself into legacy code recovery and modernization (failing marriage, socially isolated, stranded far from home in Chicagoland with no job prospects back here in Texas; it wasn't pretty). What I learned from reading old code and old texts was that what I was seeing as "bad" code was more accurately "inconvenient" code. The codes generally worked without error and were written in a way that worked around constraints in memory, CPU, architecture, environment, language, and state of the art.

The problem was lack of long-term maintenance: the codes were optimized for constraints that no longer existed, and the original developers had moved on to other projects and hadn't been replaced. This is not "technical debt"; this is simple neglect. It's understandable, though, because many of these codes had lasted far longer than anyone expected them to, and there's a strong culture of Don't Fix It If It Ain't Broke. Eventually environmental change catches up to the code and it breaks, especially if the maintainers are subject matter experts who don't update their toolchain periodically (or ever).

Fortran is a surprisingly resilient language for being almost 80 years old, and it still amazes me that 50-year-old code builds and runs on modern systems with very little change needed. Part of that is a culture of backward compatibility and low maintenance requirements. By and large, the legacy codes I see are written reasonably well for their era.

My view of what I consider "bad" code has tempered over the past 15 years as I've done more research into coding practices, instruction, and hardware as they developed from the late 1950s to the present. We don't generally notice the overhead of function calls today, both because we have an abundance of resources and because modern compilers are insanely good at optimization. That was not the case for a long time, so when you see long (10+ page) routines in these codes, it helps to understand that there was a non-negligible cost to creating and calling functions, and that there may have been limits on the number of functions a compiler (or linker, or operating system) could support.

I can't tell you what it was like to program a low-end IBM 360 in 1968 - it likely involved punch cards and F66, but I don't have a feel for the memory space, processor and I/O speed, tape storage, or operating environment. My first computer was a VIC-20 with slightly less than 4kB of usable RAM and cassette tape storage. I can tell you about writing code to fit into 3583 bytes, load times from cassette, and the joys of a 1MHz 8-bit processor.

Put aside the nostalgia - try writing code on a CP/M system today to understand the limitations (I highly recommend the @rc2014 for real hardware), or use simh to emulate older mainframe or timesharing systems (https://opensimh.org/). Again, this is about understanding constraints that no longer exist on modern desktop machines.

This extends to toolchains - modern IDEs and compilers make easy work of developing and debugging. There's a very practical reason global variables fell out of favor: not some theoretical or pedagogical objection, but the fact that they led to errors and broken code. Debugging 40 years ago was much harder, and you see guidance on good practice and debugging even in texts from the early 60s.

So when I see a modern popular text on R hand-wave away concerns about globals as academic, I immediately distrust the programming culture around R (and no, it's not just one book by one author; look up variable scope and environments in "R in Action".)

R is a very very useful interactive desktop tool. Think of it as bash but for statistics and data analysis. I can see its value. But as a programming language and a programming culture it's hot garbage. Languages are more fixable than culture.

🎉 It's FOSS turned 13 today!

13 years of spreading Linux knowledge, supporting the community, and promoting open source.

โค๏ธ Thank you for being a part of this journey โ€” together, weโ€™ll keep growing! ๐Ÿ’ชโœจ

#itsfoss #opensource #linux

Fabien Sanglard published a blog series on driving C compilers, i.e. running the compiler toolchain to build executable programs:

https://fabiensanglard.net/dc/index.php

More recently Julia Evans @b0rk posted on the related topic of using Make to compile C programs, which nicely complements Fabien's series:

https://jvns.ca/blog/2025/06/10/how-to-compile-a-c-program/

#clang #gcc #compilers

Doctors Could Hack the Nervous System With Ultrasound

A new stimulation technique targets inflammation and diabetes

IEEE Spectrum

New computer with Windows pre-installed? Want to install #Linux 🐧 instead?

You pay for #Windows, even if you don't use it. That's unfair and non-transparent.

#Refund4Freedom from @fsfe & @ItaLinuxSociety defends your right to get refunds for unused pre-installed software! 😎

https://refund4freedom.org/

The campaign starts in #Italy 🇮🇹 but will later be extended.

FSFE & ILS support your right to choose your operating system. They also support #EndOf10 to prevent e-waste!

#GetYourWindowsRefund

@Daojoan Do not need anything, just — thank you for doing this. Paying it forward is a great way to go, and makes the world a better place.
I wrote a python program to send out Death By Scrolling steam keys. All the keys are in a sqlite database; I pass the program an email address, and it picks an unused key and sends an email with the key and instructions. I'll start sending out keys tomorrow. A few at first, then more and more at random. I might pause sending keys if we do a big retooling or bug fix. You can only make a first impression once. I can't guarantee everyone will get in. Wish us luck.
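
For anyone curious, here is a minimal sketch of how such a sender could be put together with Python's standard library. This is not the actual program: the database schema, table and column names, sender address, and local SMTP relay are all assumptions for illustration.

```python
# Minimal sketch of a key sender like the one described above (not the actual program).
# Assumed schema: steam_keys(steam_key TEXT PRIMARY KEY, claimed_by TEXT)
import sqlite3
import smtplib
import sys
from email.message import EmailMessage

def send_key(db_path: str, recipient: str) -> None:
    con = sqlite3.connect(db_path)
    try:
        # Pick one key that hasn't been handed out yet.
        row = con.execute(
            "SELECT steam_key FROM steam_keys WHERE claimed_by IS NULL LIMIT 1"
        ).fetchone()
        if row is None:
            raise SystemExit("No unused keys left")
        key = row[0]

        # Mark it as claimed before mailing so the same key is never sent twice.
        con.execute(
            "UPDATE steam_keys SET claimed_by = ? WHERE steam_key = ?",
            (recipient, key),
        )
        con.commit()
    finally:
        con.close()

    # Build and send the email with the key and redemption instructions.
    msg = EmailMessage()
    msg["Subject"] = "Your Death By Scrolling Steam key"
    msg["From"] = "keys@example.com"          # placeholder sender address
    msg["To"] = recipient
    msg.set_content(
        f"Your key: {key}\n\n"
        "Redeem it in Steam via Games -> Activate a Product on Steam."
    )
    with smtplib.SMTP("localhost") as smtp:   # assumes a local mail relay
        smtp.send_message(msg)

if __name__ == "__main__":
    send_key("keys.db", sys.argv[1])
```

Run as `python send_key.py someone@example.com` against a hypothetical keys.db; a real version would likely add error handling and logging of what was sent to whom.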
Who called it "cloud identity provider" when they could have called it "Attack Surface as a Service"
It's the time of year when I get to be super #solarpunk - cutting back the vegetation that's starting to shade the #solar panels that run the solarcene server.