I'm a little late for #throwbackthursday in my timezone, but it's still Thursday west of here, so...

Here's the cover of the instruction set manual for the first microprocessor I learned to program. In hand-assembled binary. From code written on column-grid paper.

("Kids today! We used to have to carve our programs into stone tablets! And all we had were tall and short ones because zero wasn't invented yet!")

@syscrusher I took my first steps into the world of microcontroller programming with the #8051 in the mid-2000s. After a semester of writing assembly programs for the 8051, I really learned to value higher-level languages like C and C++.

@mmeese The #8051 was popular even when I interned at Intel in the 1980s. My boss told me that although the spiffy new 80286 was making headlines, the company's real bread and butter was the humble, field-proven 8051.

I'm pretty sure the 8051 core is still widely used today, as a component in FPGAs and SoC designs. It may be old tech, but just how smart or fast do you need a CPU to be for a kitchen appliance?

I, too, learned the value of compilers by coding without them. That said, I think learning how computers work BEFORE learning how to program them is STILL the best way. If you understand the hardware, you'll never be apprehensive about learning a new coding language. Pointers are not mysterious or scary if you've wrangled RAM addresses in CPU registers with assembly language.

@syscrusher "Pointers are not mysterious or scary if you've wrangled RAM addresses in CPU registers with assembly language." Absolutely! We started with C, and I hated pointers and didn't really understand them until I had to use assembly. After that, it was like a revelation. Before that, I knew how to use them, but I didn't understand them completely. The 8051 is rock solid and proven. And I think modern variants include much more in the package to simplify wiring.
@mmeese Indeed. I remember when programmable logic arrays (PLAs) first emerged, and then field programmable gate arrays (FPGAs), and now there are complex chips -- up to and including whole microprocessors like the 8051 -- that are available as libraries for custom system-on-chip (SoC) designs.
@syscrusher Wow, you were way more advanced than me in the dark ages with the 4040! Hand-crafted machine code burned into EPROM. Oh, boy, if that doesn't betray my age...
@groundie The 4040, or did you mean the 4004? If you worked with the 4004 or the 8008, you were earlier than me, and I am seriously impressed. :)

@syscrusher The 4040 came after the 4004, with 24 spindly legs that could easily bend if you didn't line them up just right!

I do NOT miss the old days...

@groundie Thanks for the info. I interned at Intel in the mid-1980s but don't recall the 4004 being mentioned.

I was there when the 80286 (or just the "286") was the hot new generation and the 8086 and 8088 were the bread-and-butter standards (leaving aside microcontrollers like the 8041, 8042, and 8051 series). The 80186 and the little-known 80188 were compatible with the 8086 and 8088, respectively, but integrated core motherboard features like the interrupt controller onto the processor chip.

That was also the time of Intel's iAPX 432, a microcoded CPU that built support for object-oriented programming (OOP), such as object addressing and protection, directly into its architecture, with Ada as its intended language. The 432 was an interesting idea but failed massively in the marketplace because its microcoded architecture on a CPU core of that era was simply too slow to be viable for real applications.