#TIL Today we all write hexadecimal numbers using the letters "A"-to-"F", and this standard actually came from the #IBM S/360. Most earlier computers used octal; the S/360 used hex, and because IBM chose 'A'-to-'F', everyone followed.

https://retrocomputing.stackexchange.com/questions/15099/was-the-ibm-s-360-responsible-for-popularizating-the-a-to-f-notation-in-hexa

#retrocomputing #s360

Was the IBM S/360 Responsible for Popularizing the 'A'-to-'F' Notation in Hexadecimal Numbers?

In the early history of computing, before the mid-1960s, there wasn't a universal, de facto standard for the written representation of hexadecimal numbers; different computer systems used their own…

@niconiconi so there were other options besides A-F for 10-15? Vowels? Roman numerals? Diacritics like 0' 1'?

@niconiconi wow. I just realized how non-universal 0x is as a hex prefix.

@tomosaigon @niconiconi I tend to use $ in assemblers that support it because it's a prefix (postfixes don't scan as well) and it's one character instead of two.

And I got started writing assembler on 8-bit micros, where that's the convention...

@thoth @niconiconi I'd blocked out my memories of Z80/x86 asm. Apparently they support (depending on the assembler) multiple syntaxes for hex: 0x, $0x, $, and h (suffix).

It's probably more popular these days to use languages where $ denotes a variable rather than a literal!
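
(For anyone who wants to play with those spellings: a minimal Python sketch. parse_hex is a hypothetical helper of mine, not any real assembler's lexer; it just accepts the notations named above.)

```python
def parse_hex(token: str) -> int:
    """Parse a hex literal written as 0x..., $..., or ...h."""
    token = token.strip()
    if token.lower().startswith("0x"):   # C-style prefix
        return int(token[2:], 16)
    if token.startswith("$"):            # 8-bit micro convention
        # Amusingly this also handles "$0x...", since Python's
        # int(s, 16) tolerates a 0x prefix.
        return int(token[1:], 16)
    if token.lower().endswith("h"):      # Intel-style suffix
        return int(token[:-1], 16)
    raise ValueError(f"not a recognized hex literal: {token!r}")

assert parse_hex("0xFF") == parse_hex("$FF") == parse_hex("0FFh") == 255
```

(The Intel-style form needs a leading decimal digit, 0FFh rather than FFh, so the assembler can tell a number from a label.)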

@tomosaigon @niconiconi $0x... o.O

Better Z80 than 8086 though. Just thinking about x86 segmentation makes me wince.

@thoth @tomosaigon linear address space was a real luxury back then.

@niconiconi @tomosaigon Yeah, but the Z8k and 65816 handled segmentation better. And of course the 68k, which was more popular than the 8086 for a while, was totally flat.
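
(Concretely, the wince: in real mode the 8086 forms a 20-bit physical address as segment * 16 + offset, so many different segment:offset pairs alias the same byte. A tiny sketch; phys is just my illustrative name.)

```python
def phys(segment: int, offset: int) -> int:
    """8086 real-mode address arithmetic: 20-bit physical address."""
    return ((segment << 4) + offset) & 0xFFFFF

# Two different segment:offset pairs naming the same byte:
assert phys(0x1234, 0x0005) == phys(0x1000, 0x2345) == 0x12345
```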

@niconiconi Nice history nugget. Whether or not the S/360 was first, it also seems the natural choice that bases beyond decimal borrow the alphabetic sequence rather than inventing special characters to represent those digits.
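
(That "keep going into the alphabet" move generalizes past base 16. A small sketch, with DIGITS and to_base being my own names rather than anything standard:)

```python
import string

DIGITS = string.digits + string.ascii_uppercase  # "0123456789AB...Z"

def to_base(n: int, base: int) -> str:
    """Write a non-negative integer in the given base, digits 0-9 then A-Z."""
    if not 2 <= base <= len(DIGITS):
        raise ValueError("base out of range")
    if n == 0:
        return "0"
    out = []
    while n:
        n, r = divmod(n, base)
        out.append(DIGITS[r])
    return "".join(reversed(out))

assert to_base(255, 16) == "FF"
assert to_base(131, 12) == "AB"  # duodecimal needs just A and B
```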