Happy Mainframe Day!
OTD 1964: IBM announces the System/360 family. 8-bit bytes ftw!
Shown: Operator at console of Princeton's IBM/360 Model 91.
@aka_pugs Really was the beginning of the modern era of computing, starting with the normalisation of 8-bit bytes and character addressable architecture.
Well, that's all true so long as we don't mention EBCDIC 🙂
@aka_pugs @markd ASCII mode was only about how some of the decimal arithmetic instructions behaved. For the printers, the character set was pretty arbitrary, and the Translate instruction would have allowed for easy compatibility no matter what. The real EBCDIC issue was the card reader—and per Fred Brooks, IBM wanted to go with ASCII but their big data processing customers talked them out of it. But that's a story for another post. (And 8-bit bytes? Brooks felt that 8-bit bytes and 32-bit words were among the most important innovations in the S/360 line. It wasn't a foregone conclusion—many scientific computing folks really wanted to stick with 36-bit words, for extra precision. IBM ran *lots* of simulations to assure everyone that 32-bit floating point was OK.)
Why yes, in grad school I did take computer architecture from Brooks…
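The Translate instruction mentioned above is conceptually just a 256-byte table lookup: each byte in a buffer is replaced by the byte at that offset in a translation table, which is why the character set on the wire hardly mattered. A minimal Python sketch of the idea (the partial EBCDIC-to-ASCII table here is illustrative only, covering digits, space, and uppercase letters, not the full code page):

```python
# Sketch of the S/360 Translate (TR) idea: replace each byte of a buffer
# with the byte at that offset in a 256-entry table.

def make_ebcdic_to_ascii_table() -> bytes:
    table = bytearray(b"?" * 256)          # unmapped code points become '?'
    table[0x40] = ord(" ")                 # EBCDIC space
    for i, ch in enumerate("0123456789"):
        table[0xF0 + i] = ord(ch)          # EBCDIC digits are F0-F9
    # EBCDIC letters come in three non-contiguous runs (a punch card legacy):
    for i, ch in enumerate("ABCDEFGHI"):
        table[0xC1 + i] = ord(ch)
    for i, ch in enumerate("JKLMNOPQR"):
        table[0xD1 + i] = ord(ch)
    for i, ch in enumerate("STUVWXYZ"):
        table[0xE2 + i] = ord(ch)
    return bytes(table)

def translate(data: bytes, table: bytes) -> bytes:
    # One table lookup per byte -- what TR did in a single instruction.
    return data.translate(table)

EBCDIC_TO_ASCII = make_ebcdic_to_ascii_table()
# "IBM 360" in EBCDIC: C9 C2 D4 40 F3 F6 F0
print(translate(bytes([0xC9, 0xC2, 0xD4, 0x40, 0xF3, 0xF6, 0xF0]),
                EBCDIC_TO_ASCII))          # b'IBM 360'
```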
@SteveBellovin @aka_pugs If you were on the non-EBCDIC side of the fence you got the impression that IBM sales pushed EBCDIC pretty hard as a competitive advantage - even if their engineering covertly preferred ASCII.
The 32-bit word must have been a harder sell for the blue suits, since the competition were selling 60-bit and 36-bit machines, amongst other oddballs.
Fortunately, the emergence of commercial customers marked the declining relevance of scientific computing... Did IBM get lucky, or were they prescient?
But yeah, the S/360 definitely marked the end of the beginning of computing in multiple ways.
@markd @SteveBellovin IBM had a huge lead in commercial data processing because of their punch card business. And that world did not care about floating point. The Model 91 was an ego-relief product, not a real business, IMO.
Data processing and HPC markets never converged - until maybe AI.