Is this why modern software feels like garbage?

For 20 years, it was reasonable to expect computers to be twice as powerful every two years.

So we built things that sort of worked on modern machines, but would work really well in two years.

And then growth slowed.

@ajroach42 I feel like that's not quite the full story.

This graph shows you what the high end was, and it shows you transistor density, not performance.

But what about the low end?

On the low end, in 1978, when that graph started, the average home computer had a 1 MHz 6502 or a 1.8 MHz Z80.

Move ahead 14 years, to 1992.

The bottom end of the *NEW* home computer market had a 1-2 MHz 6502 or a 3.5 MHz Z80, and there was a massive install base of similarly-performing computers.

@ajroach42 Even in the PC world, a 4.77 MHz 8088 runs about as fast as a ~1 to 1.2 MHz 6502 in real world code. It had a lot more memory, of course, but that's beside the point.

A lot of real-world applications needed to run acceptably on that hardware in 1992.

Sure, there was a lot of stuff that barely ran on a 486DX2-66 - the fastest x86 in 1992, roughly 70x faster than that 4.77 MHz 8088 - but DOS applications were expected to run on that 8088 unless they had a damn good reason not to.

@ajroach42 Note that Windows 3.x, from 1990-1992, started to redefine what a bottom-end IBM PC compatible meant.

In 1990, a Turbo XT clone - usually 8-10 MHz - was a perfectly acceptable low-end machine.

By 1992, you really needed a 16 MHz 386SX, because that Turbo XT could not reasonably run Windows 3.0, and couldn't run Windows 3.1 at all. Even a 286 had memory management issues.

Then, Windows 95 really wanted a 486 or better, even though it supported the 386.

@ajroach42 With the move to Windows, the bottom end of the market was forced onto the upgrade treadmill.

(Outside of IBM PCs... the people clinging to 8-bit platforms started having to jump ship around this point. Apple II users had the Macintosh as an option, Acorn users had the Archimedes and RiscPC as a very natural option, but everyone else's 32-bit successor platforms had died, so everyone jumped to the PC or clones.)

@ajroach42 In any case, the move to multitasking GUIs meant that some performance tricks applications had used in the past no longer worked. The added complexity of learning new ways of doing things meant that optimization took a back seat to figuring out how to do things in a GUI at all. And the increased minimum requirements just to run the GUI meant you had more hardware anyway, so who cares.

@ajroach42 And then, the Internet happened.

Keep in mind that personal computers were certainly a *thing* by the mid-1990s, but they weren't universal.

The Internet was the killer app for personal computers. It made them universal.

So now, you had a massive install base of computers newly sold in the late 1990s.

In five years, your baseline performance moved from a decade-old 4.77 MHz 8088 to a brand-new 166 MHz Cyrix MediaGX.

@bhtooefr that’s a good point.

Prodigy and AOL were the reason we got our first PC.