The problem facing the tech industry is highlighted most clearly when you play old games. I just tried playing some of the Space Quest games in ScummVM (5, annoyingly, crashes with an assertion failure halfway through). The first one shipped in 1986, the last in 1995, only nine years later.
There's enormous improvement across the series. The first one was clunky even when I first saw it. I had 2 and 3 (from 1987 and 1989, respectively) on my 8086 with EGA graphics. They were close to the limits of what the hardware could do. Each was a noticeable step up in quality.
The jump in graphics quality in 5 was immense. Most of the 'the thing is there but I don't notice it because it's only a couple of pixels' annoyances were gone. The final one had even better graphics, video sequences, and spoken dialog rather than just text on the screen.
Compare that to modern games. I have an Xbox Game Pass subscription and it gives me games from the Xbox 360 to the Series X. The 360 was released 20 years ago. When I load a 360 game, I notice the difference a bit, but a lot of them feel just like modern games. The improvements in graphics have been incremental. The latest games use ray tracing, but the rasterisation modes are not that much worse.
And that's a big problem for an industry that wants to sell constant growth. If I upgraded from a computer that could play Space Quest 3 to one that could play 6 (8086 with 640 KiB of RAM and an EGA screen, to 486 SX with 8 MiB of RAM, an SVGA screen, and a CD-ROM drive), the improvements were immense. There was a clear reason to upgrade. Even once you'd reached market saturation with the old machine, you could still sell the upgrades because the newer hardware enabled things that were far beyond what the old ones could do.
The same applied in all other fields. But two things happen as a technology matures:
First, you hit diminishing returns. Sure, Xbox Series X games look nicer than Xbox One games. But not that much nicer. The same goes for business apps. Spreadsheets were mostly feature-complete in the '90s (Improv was arguably far ahead of most things you can buy today). Final Cut Express 2.0 was released 22 years ago and did almost everything a non-destructive video editor should do (newer ones do more on the GPU, handle newer codecs, and support higher resolutions). Most of the time, technology reaches a 'good enough' level and it's hard to sell incremental improvements. This is why the industry loved things like always-on Internet and mobile phones, and now reaches for AR, 'AI', and so on in the hope that they'll enable new use cases.
Second, organic demand growth is slow. You may be able to sell twice as much compute, or twice as much storage, but who wants to buy it? I think of this as the Oracle problem: the hardware you needed for a payroll and inventory database in the early '90s was an incredibly expensive server. Now a couple of Raspberry Pis will do it happily (with transparent failover if one dies) for under $100. The requirements may grow at 5-20% per year, but the cost of providing them shrinks faster, so the market is shrinking. And that's why the industry jumps on anything that lets them lock in customers.
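To make that arithmetic concrete, here's a minimal sketch. The specific rates are illustrative assumptions, not measured figures: demand growing at 10% per year (mid-range of the 5-20% above) against a cost per unit of capacity falling at an assumed 25% per year.

```python
# Sketch of the 'Oracle problem': if the cost of providing capacity falls
# faster than demand for it grows, total market revenue shrinks.
# Both rates below are illustrative assumptions, not real data.

demand_growth = 0.10   # demand grows 10% per year
cost_decline = 0.25    # cost per unit of capacity falls 25% per year

demand = 1.0           # normalised units of capacity needed
unit_cost = 1.0        # normalised cost per unit of capacity

for year in range(11):
    market = demand * unit_cost  # revenue = units sold * price per unit
    print(f"year {year:2d}: demand {demand:5.2f}x, "
          f"unit cost {unit_cost:4.2f}x, market {market:4.2f}x")
    demand *= 1 + demand_growth
    unit_cost *= 1 - cost_decline
```

With those assumed numbers, demand grows to about 2.6x over a decade while the market shrinks to roughly 15% of its original size. You can sell more and more capacity and still take in less and less money.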