Every GPU That Mattered

49 graphics cards. 30 years. From Quake to Cyberpunk.

Absolute nostalgia fever. About a month ago, I dug up an old desktop from the corner, pulled the drives, and gave the machine away. It felt like putting a racehorse out to pasture: i7-4790K, 1080 Ti. It was my dream machine when I got it. Dual-booted (as we did back in the old days before Proton) into Ubuntu, then Elementary, then Arch. By the time I gave it away it wasn't worth the power cost.

And that brought to mind my older dream machine, an 8800 GT from generations past. Before that we made do with a VIA UniChrome that worked well enough on the OpenChrome driver that I could edit open-source games (Freespace only needed a few constants changed) to get them to render (some of the image was smeared and so on, but I could play!).

Hey, I could have used that i7-4790K!

I've been running the worst gaming setup I can get away with, which at the moment is a 3080 10GB with random DDR3 RAM, a budget WD 512GB SSD, and an i5 on the same socket as the i7-4790K that doesn't even support hyperthreading and can't run more than 4 threads in parallel.

It's absolutely laughable at this point, but I'm unironically looking for a deal on that CPU lmao, it would be a huge upgrade.

I'm still rocking a Z97, an i7-4790K and a 980 Ti :) I'm waiting until I actually need an upgrade. DDR3 still performs well enough for the games I run.
Same. Still play StarCraft II on a 4790K and an AMD R9 Fury X.
I was running a 970 Ti for the longest time; it was only when I wanted to get into some VR gaming that it was time for an upgrade.
I also have that exact setup sitting around, but am just using my Ryzen laptop now.
I used my 1080 Ti for about eight years. Its successor is in some ways far faster (raytracing, AI features, etc.), but in others quite stagnant considering the huge stretch of time that passed between them. Roughly ten years for 2-3x performance, at higher nominal and real price points, shows how slow silicon advances have become compared to the 90s and 2000s; the same span from 2000 to 2010 would've seen 1000x the performance, if not more. The difference between a 1080 Ti and a more expensive RTX 50-series card is that the RTX can render ideally triple the frames in synthetic benchmarks, double the frames in some rasterized games (most games won't see gains that high), and do a few relatively tame raytracing tricks at performance that's still not really good. At the same throughput it consumes maybe half the power or a bit less. The difference between a GeForce 2 and, e.g., a Radeon HD 4000-series card is several planes of existence.
A lot of GPUs in this list are basically just "the previous GPU, but faster or with more RAM." I kind of thought it was going to focus on interesting new architectural innovations.
Like the PS3? Seems like everything is using PC architecture now. It does have RDNA.
I think pairing the RX 5700 XT with Control as its "defining game" is an interesting choice, considering that (1) AMD cards were incapable of RT at the time, and (2) Control was basically the first game with a good, comprehensive RT implementation that had a massive positive impact on the graphics.

> a massive positive impact on the graphics

I remember the main noticeable difference being ray-traced reflections. However, those were mostly on immovable objects in extremely simple scenes (an office building). Old techniques could've gotten 90% of the way there using cubemaps, screen-space reflections, and/or rasterized overlays for dynamic objects like player characters. Or maybe just rasterize the reflections completely, since the scenes are so simple and everything is flat surfaces with right angles anyway. It might even have looked better, because you avoid the issues of shaders written for a rasterized world showing up on reflected objects.
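For what it's worth, the cubemap fallback described above reduces to one piece of vector math: reflect the view direction about the surface normal and use the result as the lookup direction into a prebaked environment map. A minimal sketch in plain C (function names and sample values are mine, not from Control or any real engine; in a real renderer this runs per pixel in a fragment shader, with the result fed to a cubemap sampler):

```c
#include <stdio.h>

typedef struct { float x, y, z; } vec3;

static float dot3(vec3 a, vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

/* Reflect incident direction I about unit surface normal N.
   Same R = I - 2*dot(N, I)*N that GLSL's reflect() computes; R then
   indexes a prebaked environment cubemap instead of tracing a ray. */
static vec3 reflect3(vec3 I, vec3 N) {
    float d = 2.0f * dot3(N, I);
    return (vec3){ I.x - d*N.x, I.y - d*N.y, I.z - d*N.z };
}

int main(void) {
    vec3 view   = { 0.0f, 0.0f, -1.0f }; /* looking straight at a wall */
    vec3 normal = { 0.0f, 0.0f,  1.0f }; /* flat, axis-aligned surface */
    vec3 r = reflect3(view, normal);
    printf("cubemap lookup dir: (%.1f, %.1f, %.1f)\n", r.x, r.y, r.z);
    return 0;
}
```

The catch, as noted, is that a cubemap is baked for one point in space, which is exactly why it holds up in static right-angled offices and falls apart on irregular geometry.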

Games that heavily advertise raytracing typically don't use traditional techniques properly at all, making it seem like a bigger graphical jump than it really is. You're not comparing to a real baseline.

Overall, that was pretty much the poorest way to advertise the new tech. It's much more impressive in situations where traditional techniques struggle (such as reflections off irregular surfaces or scenes without right angles).

The other elephant in the room is the consoles: even where they're capable of RT, they have to weigh performance cost against visual payoff. As I see it, the PC versions of games like Control from studios like Remedy are trailblazers. It's an early implementation (the GeForce 20 series released in 2018, Control in 2019), shipped as the ultra option to shake down the implementation and start iterating early so future games will benefit; the baseline, however, is non-RT.

Honorable mention, the Rendition Vérité 1000: https://fabiensanglard.net/vquake/index.html

Released before the Voodoo 1, with vQuake and hardware 3D support for Tomb Raider.

Agreed. Those early manufacturers/models that experimented more feel more relevant than the more incremental listings of multiple 2000-, 3000-, and 4000-series Nvidia GPUs.
Very interesting cultural difference between Rendition and 3dfx in their chip design approaches.

It's probably just me being out of touch, but I don't think the GeForce RTX 4000 or 5000 series really mattered/matters that much.

At the same time I'd add the S3 ViRGE and the Matrox G200. Both mattered a lot at the time, but not long term.

The G200 mattered to some degree for a long time, because most x86 servers up until a few years ago shipped a G200 implementation, or at least something pretending to be a G200, as part of their BMC for network KVM.
Like virtualized NICs pretending to be an NE2000? That's interesting. Do you know why they'd use a G200 and not something like an older ATI chip?
Probably started out as a real G200 chip, which might've been the cheapest and easiest to integrate in the 2000s? Or it had the I/O features needed to support KVM (since this would've involved reading the framebuffer from the BMC side), or Matrox was amenable to adding them.
Recency bias, probably. IIRC the 3000 and 4000 series did make significant improvements in RT performance, so compared to the 2000 series they're far more useful today.

Or the S3 Savage3D, which, while being inferior to the TNT2, pioneered texture compression.

https://en.wikipedia.org/wiki/S3_Texture_Compression

+1 to that. Seeing Unreal Tournament with the add-on compressed texture pack for the first time was a real WOW moment.
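For anyone curious what S3TC actually does: each 4x4 block of texels is packed into 8 bytes, i.e. two RGB565 endpoint colors plus sixteen 2-bit palette indices, a fixed 6:1 ratio for 24-bit RGB textures. A minimal DXT1 block decoder sketch in C, following the layout described on that Wikipedia page (variable names and the sample block are my own):

```c
#include <stdint.h>
#include <stdio.h>

typedef struct { uint8_t r, g, b; } rgb;

/* Expand a packed 5:6:5 color to 8:8:8, replicating high bits into low ones. */
static rgb rgb565(uint16_t c) {
    uint8_t r5 = (c >> 11) & 31, g6 = (c >> 5) & 63, b5 = c & 31;
    return (rgb){ (r5 << 3) | (r5 >> 2), (g6 << 2) | (g6 >> 4), (b5 << 3) | (b5 >> 2) };
}

/* Decode one 8-byte DXT1 block into 16 texels (row-major 4x4). */
static void decode_dxt1(const uint8_t blk[8], rgb out[16]) {
    uint16_t c0 = blk[0] | (blk[1] << 8);  /* endpoint colors, little-endian */
    uint16_t c1 = blk[2] | (blk[3] << 8);
    rgb p[4] = { rgb565(c0), rgb565(c1) };
    if (c0 > c1) {  /* 4-color mode: two interpolated colors between endpoints */
        p[2] = (rgb){ (2*p[0].r + p[1].r)/3, (2*p[0].g + p[1].g)/3, (2*p[0].b + p[1].b)/3 };
        p[3] = (rgb){ (p[0].r + 2*p[1].r)/3, (p[0].g + 2*p[1].g)/3, (p[0].b + 2*p[1].b)/3 };
    } else {        /* 3-color mode: midpoint plus black (doubles as 1-bit alpha) */
        p[2] = (rgb){ (p[0].r + p[1].r)/2, (p[0].g + p[1].g)/2, (p[0].b + p[1].b)/2 };
        p[3] = (rgb){ 0, 0, 0 };
    }
    for (int i = 0; i < 16; i++) {  /* 2 bits per texel select a palette entry */
        uint8_t idx = (blk[4 + i/4] >> ((i % 4) * 2)) & 3;
        out[i] = p[idx];
    }
}

int main(void) {
    /* Made-up block: red and blue endpoints, each row walking the 4-entry palette. */
    const uint8_t blk[8] = { 0x00, 0xF8, 0x1F, 0x00, 0xE4, 0xE4, 0xE4, 0xE4 };
    rgb out[16];
    decode_dxt1(blk, out);
    for (int i = 0; i < 16; i++)
        printf("texel %2d: (%3d, %3d, %3d)\n", i, out[i].r, out[i].g, out[i].b);
    return 0;
}
```

Decoding being this cheap and random-access is the whole point: the GPU keeps textures compressed in VRAM and unpacks blocks on the fly, which is what let that high-resolution texture pack fit into late-90s video memory at all.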
Matrox G200 GPUs came integrated with servers for absolute ages, well into the 2010s.
This is an ad from a viral marketing company, and everyone here is falling for it.