✍🏻 New blog post! 📰

I've been brooding over the shit #NVIDIA has been pulling over the past couple of years, and I need to get this out of my system. Also I want something to point at when the question inevitably arises:

"Well, what's the problem?"

Oh, y'know, just *gestures wildly* EVERYTHING?!

https://blog.sebin-nyshkim.net/posts/nvidia-is-full-of-shit/

NVIDIA is full of shit

Since the disastrous launch of the RTX 50 series, NVIDIA has been unable to escape negative headlines: scalper bots are snatching GPUs away from consumers before official sales even begin, power connectors continue to melt, with no fix in sight, marketing is becoming increasingly deceptive, GPUs are missing processing units when they leave the factory, and the drivers, for which NVIDIA has always been praised, are currently falling apart. And to top it all off, NVIDIA is becoming increasingly insistent that media push a certain narrative when reporting on their hardware.

Sebin's Blog
Oops, sorry for that short breakage, forgot to git pull before publishing a fixed version of the page's CSS from another machine, so the blog post went missing for a few minutes x)

@SebinNyshkim Yikes. 😬 That's... a lot more bullshit than I'd previously heard about Nvidia's dealings.

Any thoughts on a counter-movement?

Sounds like there's not much leverage on the consumer market, considering the vast sums fronted by AI / Crypto into their pocket, but I do wonder if game developers can be convinced not to lock themselves into proprietary tech.🤔

@GoodNewsGreyShoes I was told that NVIDIA just makes it really easy to implement their tech, whereas AMD is a crap shoot and most of the documentation seems incomplete. And they're just doing whatever NVIDIA is doing, but worse.

So as much as I'd like AMD to be the one to cut NVIDIA down a couple notches, it seems like NVIDIA does have the better tech stack that's easier to work with?

@SebinNyshkim
It's pretty much the new "nobody can see more than 30 FPS anyway" thing. And the "propagandist crowd" (read streamers) happily hype this trash up because moneys.
Damn it, we need another TB; someone who's going to relentlessly push back against this until they manage to build up the "critical mass". They don't even need to be as unapologetic as TB was.
But since you mentioned Intel ... even the giants can fall when their adversaries ryze from the ashes.
@rawenwolf who or what is TB?
@SebinNyshkim @rawenwolf Totalbiscuit, he was a youtuber.
@JigglyWyvern @SebinNyshkim
yup, Zee was quicker
He died of cancer in 2018 and was a very vocal critic of bad business practices. Quite unapologetic and sometimes a bit abrasive too (though J. S. Sterling is harsher), yet he was able to look in the mirror and back down when he realised he'd gone too far.

@rawenwolf @JigglyWyvern Sterling is a bit abrasive at points for me, even if the points made are 100% valid.

Steve from GamersNexus also is pretty done with NVIDIA at this point and I feel like the rest of the sphere is, too.

@SebinNyshkim @JigglyWyvern
Not surprising. Like, it's not that NV makes necessarily bad tech but they're cutting far too many corners. Hell, Intel would be completely dead in the water if they did this. Not that their "4 cores for masses, more only for the 'chosen elite' " was that far off.
@rawenwolf @JigglyWyvern If anything, the only thing left to make NVIDIA eat a big piece of humble pie is gonna be the "AI" bubble popping and dealing with otherworldly losses because nobody needs their GPU-shaped shovels anymore. Longing for that day…
@SebinNyshkim @JigglyWyvern
I feel like they'll need a much bigger slap than that. Like, when the crypto gold rush simmered down they got through just fine.
NV needs a massive kick in the nuts for them to wake up. Think of an entire generation of cards having so many issues that it'd make them impossible to sell, especially in the "pro" segment.

@rawenwolf @JigglyWyvern they only got through the transition fine like they did because the "AI" scam goes by the exact same playbook as the crypto scam, and both needed gratuitous amounts of GPU compute to sustain. They keep selling us on a future that's never gonna come.

And it checks out for NVIDIA: even after 7 years, ray tracing is not the runaway success NVIDIA made it out to be. Most people avoid it because of the immense drop in FPS for visual fidelity gains that are barely noticeable unless you crank everything up to max, at which point anything below high end is out of the question.

@SebinNyshkim @JigglyWyvern
I vaguely remember toying with RT back when I got my 2070 and being like "It looks neat but it's too early."
And yes, almost a decade later, very little progress has been made, and many RT mods look like someone was trying to "polish a dog's nuts" (a saying in my language).
Oh well, at least Win11 pushing me back into penguin land has me in the market for a team red GPU. Now if only they got into a more tolerable price range.
@rawenwolf @JigglyWyvern all the RT mods so far look like garbage and show a complete disregard for the creative vision of the original work. The Portal RT mod looks especially bad and I'd much rather play it as it was originally intended. Hell, even Doom 2016 still looks and runs amazing and it doesn't use RT at all, so the selling point of RT in and of itself just evaporates for me.
@SebinNyshkim @JigglyWyvern
Pretty much
Staying out of the AAA (or however many As that market has now) space keeps the hardware demands much lower and makes it easy to avoid the overhyped bells and whistles.
@SebinNyshkim your blog is awesome! the style is very nice and the smartphone experience is very good!

@SebinNyshkim A friend linked this post to me, and I have a couple of corrections if you'll humor a rando:

1. the MSRP for the 5090 Astral (a top of the line card which is used for overclocking world records) is $2.8k, not $2k like the 5090 itself ( https://www.techpowerup.com/review/asus-geforce-rtx-5090-astral/ ); I can't find the euro MSRP but it's surely not €2.2k
2. NVENC isn't much of a moat anymore; quality isn't really better than Intel's or even AMD's alternatives now (see e.g., https://www.youtube.com/watch?v=kkf7q4L5xl8 ) (and Intel iGPUs have, surprisingly, been basically comparable in quality for a while even putting aside their dGPUs)

@SebinNyshkim wow, I had no idea how bad things had gotten. I knew about the lock-in, inflated prices and marketing BS, but it's so much worse than that 🫠.

Will keep buying AMD, and hope games continue to allow me to play them on that hardware

@SebinNyshkim Regarding Windows XP 64bit not being available "to consumers" as mentioned in footnote 6: Are you sure about that? I seem to remember it being available to anyone, but I can't find a source at the moment 100% confirming that (searching on my phone)

I see some release announcement talking about it being "generally available"
(See "Availability" section here: https://news.microsoft.com/source/2005/04/25/microsoft-raises-the-speed-limit-with-the-availability-of-64-bit-editions-of-windows-server-2003-and-windows-xp-professional/)

There was also the Far Cry 64bit patch, which wouldn't make much sense for a business-only OS IMO

Microsoft Raises the Speed Limit with the Availability of 64-Bit Editions of Windows Server 2003 and Windows XP Professional - Source

More than 400 industry partners pledge support for new platform during the Windows Hardware Engineering Conference.

@staudey @SebinNyshkim I recall Unreal Tournament 2004 also had a x86-64 version designed for 64-bit XP systems, apparently
@starstorm_x1 @staudey I know it has a 64-bit version on Linux I could install from the DVD, which was the only piece of commercial software I owned that was 64-bit back then x)

@staudey Sorry, I only now saw that reply on my computer. My phone app didn't notify me of this reply, because my instance mutes the main Mastodon social instance.

Company press releases are always a bit mmmm…

They say things like "general availability" and "customers", but that's very vague in terms of how much of it really made it into consumers' hands versus how much only shipped with business workstations for heavy-duty work that really benefited from 64-bit.

Also, Vista was 1-2 years away from release so it probably wouldn't have made much sense to throw another XP release at the consumer market when they wanted to sell a new version of Windows.

@SebinNyshkim Seems like it was available both as a pre-installed option on PCs from certain manufacturers, as well as by exchanging your Windows XP Professional 32bit licence for a 64bit one

https://web.archive.org/web/20050828024647/http://www.microsoft.com/windowsxp/64bit/howtobuy/default.mspx

https://web.archive.org/web/20050707000740/http://www.microsoft.com/windowsxp/64bit/upgrade/default.mspx

But I'm not seeing anything about directly purchasing single licenses (though there are/were copies of it available on e.g. ebay)

How to get Windows XP Professional x64 Edition

Find out how you can get Windows XP Professional x64 Edition, the next generation of Windows XP 64-Bit Edition for technical workstations.

@SebinNyshkim

I guess I'm glad my NVIDIA graphics card is approaching 15 years old!

@SebinNyshkim I knew something was up when I realized I still have an RTX 3060 and I don't have any need to upgrade it.
Raytracing? I don't use it because it kills framerate.
DLSS: I don't need it because I don't use Raytracing.
If you don't use Raytracing and you have a 2K display there is no point in getting a new card if you only use it to play games.

@meluzzy that seems to be the consensus. And honestly, the severe dip in framerate wasn't justified for the minuscule gains in visual fidelity back then, and it sure as hell isn't now with the addition of path tracing—which is even more computationally expensive, so any gains in computing power are gone again.

It makes zero sense if all ray tracing was ever supposed to deliver was sparing game developers from having to bake lighting into their scenes.

@SebinNyshkim As someone who has programmed a CPU raytracer from scratch, I can tell you this:
The problem is an intrinsically mathematical one. Finding the point of intersection of a ray and a triangle is expensive. If you have a 1080p display, you have 2,073,600 rays (one per pixel).
If your scene has 2k triangles (which is not that many by today's standards), you need to check 4,147,200,000 intersections. Those rays will very likely bounce off whatever they hit, and each bounce creates another set of rays that perform another 4,147,200,000 checks, until their strength (depending on what they hit) decays enough or a fixed bounce limit is reached.
This has to be done EVERY FRAME.
Now, there are ways of optimizing this.
Some rays are more important than others, you can apply space partitioning, you can do some statistical predictions based on that, many of which are now natively supported by GPUs.
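To make the per-pair cost concrete, here is a minimal pure-Python sketch of the standard Möller–Trumbore ray-triangle intersection test (my illustration, not code from the post): even this "cheap" test needs two cross products and several dot products for every single ray-triangle pair.

```python
def ray_triangle_intersect(orig, dirn, v0, v1, v2, eps=1e-9):
    """Moller-Trumbore: distance t along the ray to the hit, or None on a miss."""
    sub = lambda a, b: (a[0] - b[0], a[1] - b[1], a[2] - b[2])
    dot = lambda a, b: a[0] * b[0] + a[1] * b[1] + a[2] * b[2]
    cross = lambda a, b: (a[1] * b[2] - a[2] * b[1],
                          a[2] * b[0] - a[0] * b[2],
                          a[0] * b[1] - a[1] * b[0])

    e1, e2 = sub(v1, v0), sub(v2, v0)   # triangle edges sharing v0
    pvec = cross(dirn, e2)
    det = dot(e1, pvec)
    if abs(det) < eps:                  # ray parallel to the triangle plane
        return None
    inv = 1.0 / det
    tvec = sub(orig, v0)
    u = dot(tvec, pvec) * inv           # first barycentric coordinate
    if u < 0.0 or u > 1.0:
        return None
    qvec = cross(tvec, e1)
    v = dot(dirn, qvec) * inv           # second barycentric coordinate
    if v < 0.0 or u + v > 1.0:
        return None
    t = dot(e2, qvec) * inv             # distance along the ray
    return t if t > eps else None

# A ray fired down the z-axis at a triangle in the z=0 plane:
hit = ray_triangle_intersect((0, 0, -1), (0, 0, 1),
                             (-1, -1, 0), (1, -1, 0), (0, 1, 0))
print(hit)  # 1.0
```

Multiply that handful of multiplications by billions of ray-triangle pairs per frame and the cost of brute force becomes obvious.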

Now, path tracing: imagine that instead of one ray per pixel, you have 20 rays per pixel, each at a slightly different angle...
That's a LOT of intersections.
It is orders of magnitude harder to compute than plain ray tracing.

There are 2 realistic ways of pulling it off:

- You can have a computer with 6 graphics cards trying to reach 30fps on a 720p screen

or

- You can make games that don't use triangles. One of the biggest problems with ray tracing is triangles: if a different primitive were used for 3D models, the intersection calculations wouldn't be so complex.
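The arithmetic above can be sketched in a few lines (using the numbers from the thread; the 20 samples per pixel for path tracing is the commenter's illustrative figure, not a fixed standard):

```python
width, height = 1920, 1080          # 1080p display
rays = width * height               # one primary ray per pixel
triangles = 2_000                   # a modest scene by today's standards

# Brute force: test every ray against every triangle
tests_per_pass = rays * triangles
print(rays)             # 2073600
print(tests_per_pass)   # 4147200000

# Path tracing: many rays per pixel, each spawning its own tests,
# and all of this repeats for every bounce, every frame
samples = 20
print(tests_per_pass * samples)  # 82944000000
```

Which is exactly why real renderers lean on acceleration structures (BVHs, space partitioning) instead of brute force.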

@SebinNyshkim this is an excellent read. For anyone in the #gaming community. Or even #linux

@SebinNyshkim Another issue with the 12VHPWR connector you didn't mention:

The thing is rated for a current of 9.5 amps per pin. With the connector's 600 W spread across six 12 V pins, that's 100 W per pin, and 100 W at 12 V is 8.33... A. That is a factor of safety of only 1.14, which would be cutting it close even in aerospace, let alone in a consumer product! It means that if a single pin is not connected properly, or one of the wires in the cable breaks, the other five will be overloaded, even if just slightly.
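For anyone who wants to check the math, a quick sketch (my numbers follow the published 12VHPWR figures: 600 W rating, six 12 V current-carrying pins, 9.5 A per pin):

```python
total_power = 600.0    # W, 12VHPWR connector rating
voltage = 12.0         # V
pins = 6               # current-carrying 12 V pins
rated_per_pin = 9.5    # A, per-pin rating

total_current = total_power / voltage      # 50 A overall
per_pin = total_current / pins             # ~8.33 A per pin
safety_factor = rated_per_pin / per_pin    # 1.14 -- razor thin

# Lose contact on just one pin and the other five exceed their rating:
degraded = total_current / (pins - 1)      # 10 A > 9.5 A
print(round(safety_factor, 2), degraded)
```

Compare that with the old 8-pin PCIe connector, which carried far less power per pin and so had much more headroom.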

@zuthal Thank you for the insight! I'm not experienced in electrical engineering, so I'm dependent on outside info. The videos I linked go into greater detail, iirc, which is why I linked them 😊
@SebinNyshkim To this day, it baffles me as to how Nvidia was able to get away with proprietary vendor lock-in, and I hate how some software I encounter at work NEEDS to use their software stack just for it to work. Makes getting computer stuff hard. (Not helped by the fact that aside from AMD (and maybe Intel), all the competition has basically disappeared. And the fact that the low-end and lower-mid-range is slowly disappearing, and it feels like cards are getting less and less efficient and a lot bigger too...)
@starstorm_x1 that's exactly my problem with the entire situation as well. I'm being told that the NVIDIA tech stack is easier to implement, deploy and work with, while AMD's implementation is lackluster as is the documentation. Which is what hurts AMD's adoption in this space.
@SebinNyshkim I have to honestly say, that this was super informative and an interesting read! And accessible to someone like me, who only knows about this from a casual gaming perspective.
Also means I defo will not upgrade my rig again but instead build a new one when the time comes, without NVIDIA.

@midoriko Thank you! This was also my intention, to make it digestible even for people who aren't as deep into the tech to understand how this is a bad thing and how NVIDIA winning the GPU race is not a win for the consumer.

I mean, looking at the prices it's hard to overlook, but I think it's equally important to understand how we got here. I personally believe that pushing ray tracing and DLSS is key to NVIDIA's strategy of locking competitors out while locking its customer base in. It's the oldest trick in the book, but understanding how they employ it is key to not falling victim to the practice more than people already have.

@SebinNyshkim You definitely achieved that intention on my end :)