#OTD in #ComputingHistory in 1959, IBM delivered the IBM 7090, one of the earliest fully transistorized mainframe computers. Built as the successor to the vacuum-tube-based 709, the 7090 offered improvements in speed, reliability, and power consumption. https://www.acm.org/education/otd-in-computing-history
#OTD in #ComputingHistory in 1985, Microsoft launched its graphical user interface, Windows. Many of us were first introduced to personal computing through Windows, remembering those early tiled windows and pull-down menus that defined the experience. More: https://www.acm.org/education/otd-in-computing-history
Ah, Lisp machines—the hipsters of computing, forever stuck in 1983, wearing retro rose-tinted glasses 👓. Watching these romantics cling to tech fossils is like witnessing earnest historians fanboy over Betamax tapes 📼. Someone tell them the 90s called and wants its outdated tech back. 😂
https://www.tfeb.org/fragments/2025/11/18/the-lost-cause-of-the-lisp-machines/ #LispMachines #RetroTech #TechFossils #ComputingHistory #90sNostalgia #HackerNews #ngated
The lost cause of the Lisp machines

I am just really bored by Lisp Machine romantics at this point: they should go away. I expect they never will....

How Slide Rules Work · Amen Zwa, Esq.

Back in the day, the TECO text editor had an Easter egg, purportedly the first digital one: when asked to make love (i.e., to make a file named love), it would respond with "not war?" before proceeding as normal.

Someone needs to put this into the make build utility.
#programming #ComputerHistory #ComputingHistory
#OTD in #ComputingHistory in 1969, the first message was sent over the ARPANET, the network that would become the Internet. That brief, two-letter message ("LO", the start of an interrupted "LOGIN") marked the beginning of networked communication as we know it today. Read more: https://www.acm.org/education/otd-in-computing-history
#OTD in #ComputingHistory in 1949, An Wang filed a patent for magnetic ferrite core memory, a breakthrough that transformed how computers stored data. For more computing milestones: https://www.acm.org/education/otd-in-computing-history

People used to bond multiple 56k modems together to squeeze more speed from dialup. In theory each modem adds another 56 kbit/s, so the total speed is N times 56 kbit/s with no hard protocol limit. In practice it quickly becomes messy: you need one phone line per modem, your provider must support multilink PPP, and protocol overhead plus line noise cuts the speed down. Most setups only used two to eight modems before cost and complexity made it pointless. At the extremes you could run hundreds or even thousands of lines, but power, routing, and line quality would collapse performance. To reach one terabit per second you would need about eighteen million modems drawing roughly one hundred megawatts of power and filling a warehouse with copper. Technically possible in the math, completely absurd in reality.
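A quick back-of-the-envelope sketch of those numbers in Python, assuming 56 kbit/s of usable throughput per modem and roughly 5.6 W of draw per unit (the per-modem wattage is my assumption, not from the post):

```python
# Back-of-the-envelope check of the bonded-modem estimate above.
# Assumptions: 56 kbit/s usable per modem, ~5.6 W draw per modem (illustrative).

TARGET_BPS = 1e12          # one terabit per second
MODEM_BPS = 56_000         # per-modem throughput, ignoring multilink PPP overhead
WATTS_PER_MODEM = 5.6      # assumed power draw; real hardware varies

modems_needed = TARGET_BPS / MODEM_BPS
power_megawatts = modems_needed * WATTS_PER_MODEM / 1e6

print(f"Modems needed: {modems_needed:,.0f}")        # about 17.9 million
print(f"Power draw:    {power_megawatts:,.0f} MW")   # about 100 MW at the assumed wattage
```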

#RetroTech #Phreaking #Networking #Dialup #ComputingHistory

PhD position on the environmental impacts of AI at the KTH Environmental Humanities Laboratory, as part of a project funded by the Wallenberg AI, Autonomous Systems and Software Program – Humanity and Society (WASP-HS).

The doctoral student will be part of the project “AI Planetary Futures: Climate and environment in Silicon Valley's AI paradigm”, which analyses the climate and environmental aspects of Silicon Valley-based general-purpose AI systems developed by Big Tech companies. It will specifically examine how companies' actions relate to their environmental impact, and the doctoral student will investigate how perceptions of human environmental impact change with accelerating AI implementation. The doctoral student will be affiliated with the WASP-HS Graduate School.

The application deadline is October 23, 2025.

https://www.kth.se/lediga-jobb/858323?l=en

#envhist #AI #computingHistory #climateChange #SiliconValley

KTH | PhD student in Environmental impact of AI within WASP-HS Graduate School


@AmenZwa
"#OTD in #ComputingHistory in 1956, IBM released the first manual for the programming language FORTRAN."
https://mathstodon.xyz/@ACM@mastodon.acm.org/115379029173813774

BTW, re: your book on improving Fortran: have you implemented it? It sounds like a language with real advantages, but it's different enough that I'm not sure people would still call it Fortran. I was somewhat overwhelmed by the amount of thought in it; is it at all backward compatible?

One thing caught my eye that would break things: row-major versus column-major ordering. Fortran arrays are stored column-major, so Fortran data and loops are usually arranged so that column-major access walks sequential RAM, and often stays in cache, because Fortran programmers are keenly aware of how that ordering affects the runtimes of very long simulations.
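For anyone following along, a minimal sketch of why that matters, done in Python/NumPy rather than Fortran (my choice of illustration, not from the thread): the same column-wise traversal only touches contiguous memory when the array is actually stored in column-major (Fortran) order.

```python
# Illustrative sketch: the same column-wise loop is cache-friendly only when the
# array is stored column-major (Fortran order). Array size and timings are arbitrary.
import time
import numpy as np

n = 4000
a_c = np.random.rand(n, n)        # row-major (C order) storage
a_f = np.asfortranarray(a_c)      # same values, column-major (Fortran order) storage

print(a_c.strides, a_f.strides)   # (32000, 8) vs (8, 32000) for float64

def sum_by_columns(a):
    # Walk the matrix one column at a time, the natural Fortran-style traversal.
    total = 0.0
    for j in range(a.shape[1]):
        total += a[:, j].sum()
    return total

for name, a in (("C order", a_c), ("Fortran order", a_f)):
    t0 = time.perf_counter()
    sum_by_columns(a)
    print(f"{name}: {time.perf_counter() - t0:.3f} s")
# In C order each column is strided, so the loop jumps roughly 32 kB per element;
# in Fortran order each column is contiguous, which is what Fortran code relies on.
```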
