trying to figure out a title for this "how your CPU represents integers and floating point numbers" zine

so far the best idea is "inside the machine: integers and floating point" but that's not great

@b0rk Something I’ve been trying to figure out about big- and little-endian designs: The reason I’ve always felt weird about little-endian integers is that, in the usual way of writing them out, the most-significant bit of each byte is on the left but the most-significant byte is on the right — the bit order and byte order run in opposite directions.

But… is that just because of the convention of writing left-to-right? Could you write a 16-bit little-endian int containing “1” as

10000000 00000000

?

Does the hardware fundamentally care or is it just writing convention?
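One way to poke at this (a small sketch, not a definitive answer): Python's `struct` module lets you see the byte order directly, and iterating over the bytes shows that the "bit order" inside a byte is purely how we choose to print it — memory is byte-addressable, so software can't observe bit ordering within a byte at all.

```python
import struct

# Pack the 16-bit integer 1 in little-endian ("<H") and big-endian (">H") order.
little = struct.pack("<H", 1)
big = struct.pack(">H", 1)

print(little.hex(" "))  # 01 00  -- least-significant byte first
print(big.hex(" "))     # 00 01  -- most-significant byte first

# Within each byte, writing the MSB on the left ("00000001") or the
# right ("10000000") is just notation -- the hardware only hands us
# whole bytes, so this loop prints the conventional MSB-left form:
for b in little:
    print(f"{b:08b}")
```

So the byte order is a real hardware property (it determines which byte lives at the lower address), but writing the bits of 1 as `10000000 00000000` is a legitimate notational choice — it just makes bit significance increase left-to-right consistently, matching little-endian byte order.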