Intel announced plans to start making GPUs, challenging NVIDIA's dominance
It isn’t much of a challenge if they suck. Just planning to make them doesn’t mean shit.
Also, why do none of these articles have a summary posted for them? These are some seriously low effort posts.
Just what every consumer needs. More AI focused chips.
Intel just trying to cash in on the AI hype to buoy the sinking ship, as far as investors are concerned.
focus on AI
Never mind guys, it’s a nothing burger
I was sure their focus was already on AI. Bought an Arc A770 when I first built my PC. It was alright, but the gaming aspect had a lot of flaws.
Each driver update had some improvements, but the bulk of it felt like AI bullshit.
TSMC is how they stay competitive; that’s what everyone else uses
Intel is still catching up with 18A
The 18A production node itself is designed to prove that Intel can not only create a compelling CPU architecture but also manufacture it internally on a technology node competitive with TSMC’s best offerings.
Well that article was a waste of space. Intel has already stepped into the GPU market with their Arc cards, so at the very least the article should clarify what the CEO meant.
And I see people shitting on the arc cards. The cards are not bad. Last time I checked the B580 had performance comparable to the 4060 for half the cost. The hardware is good, it’s simply meant for budget builds. And of course the drivers have been an issue, but drivers can be improved and last time I checked Intel is actually getting better with their drivers. It’s not perfect but we can’t expect perfect. Even the gold standard of drivers, Nvidia, has been slipping in the last year.
All this is to say, I don’t understand the hate. Do we not want competition in the GPU space? Are we supposed to have Nvidia and AMD forever, until AMD gives up because it becomes too expensive to compete with Nvidia? I’d prefer it were someone other than Intel, but as long as the price comes down I don’t care who brings it down.
And to be clear, if Intel’s new strategy is keeping prices as they are, I’m all for “fuck Intel”.
This is a big part of it, imo. They kissed the ring.
The other part of it is that, per the article, this is an “AI” pivot. This is not them making more consumer-oriented GPUs. Which is frustrating, because they absolutely could be a viable competitor in low-mid tier if they wanted to. But “AI” is (for now) much more lucrative. We’ll see how long that lasts.
Intel GPU support?
ZLUDA previously supported Intel GPUs, but not currently. It is possible to revive the Intel backend. The development team is focusing on high‑quality AMD GPU support and welcomes contributions.
Anyways, no actual AI company is going to buy $100M of AI cards just to run all of their software through an unfinished, community-made translation layer, no matter how good it becomes.
oneAPI is decent, but apparently fairly cumbersome to work with, and people prefer to write software in CUDA since it’s the industry standard (and the standard in academia).
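For a sense of what that lock-in looks like in practice, here’s a minimal sketch (purely illustrative, nothing Intel- or ZLUDA-specific): a plain CUDA vector-add of the sort the whole ecosystem is written against. A translation layer like ZLUDA would have to run code like this unmodified on non-NVIDIA hardware for the ecosystem argument to stop mattering.

```cuda
// Minimal CUDA vector-add: the kind of code the "industry standard" point
// refers to. A translation layer such as ZLUDA reimplements the CUDA
// runtime/driver underneath so programs like this can run unmodified.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void vecAdd(const float* a, const float* b, float* c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;
    const size_t bytes = n * sizeof(float);
    float *a, *b, *c;
    // Unified memory keeps the example short; a real app might use
    // cudaMalloc + explicit cudaMemcpy instead.
    cudaMallocManaged(&a, bytes);
    cudaMallocManaged(&b, bytes);
    cudaMallocManaged(&c, bytes);
    for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

    vecAdd<<<(n + 255) / 256, 256>>>(a, b, c, n);
    cudaDeviceSynchronize();

    printf("c[0] = %f\n", c[0]);  // expect 3.0
    cudaFree(a); cudaFree(b); cudaFree(c);
    return 0;
}
```

The oneAPI/SYCL equivalent isn’t hard, but it’s a rewrite (queues, buffers, a different launch syntax), which is exactly the friction being described above.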
Good. So prices might actually be reasonable.
From what I’ve read about the “quality” of their drivers, Nvidia isn’t under any threat whatsoever. Years before bugs get fixed, etc. (Linux, not MS-Windows, but it’s Linux where the big compute gets done, so that’s relevant.)
See https://www.phoronix.com/review/llama-cpp-vulkan-eoy2025/5 for some relevant graphs: Intel isn’t a real competitor, and while they may work to change that, the lag behind Nvidia is seriously bad.
_ /\ _
Yes, it works out to a ton of power and money, but on the other hand, 2x the computation might only be a few percent better in results. So it’s often a matter of orders of magnitude, because that’s what’s needed for a sufficiently noticeable difference in use.
Basing things on theoretical TOPS also isn’t particularly equivalent to performance in actual use; it just gives a very general idea of a perfect workload.
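To put rough numbers on the “orders of magnitude” point (assuming, purely for the sketch, the power-law relationship between loss and compute reported in the LLM scaling-law literature, with an exponent around 0.05):

```latex
% Illustrative only: assume model quality (loss L) scales with compute C as a
% power law, with an exponent in the ballpark reported by scaling-law papers.
\[
  L(C) \propto C^{-\alpha}, \qquad \alpha \approx 0.05
\]
% Doubling compute then improves loss by only
\[
  \frac{L(2C)}{L(C)} = 2^{-\alpha} = 2^{-0.05} \approx 0.966,
\]
% i.e. roughly a 3--4% reduction for 2x the power and money. Even 10x compute
% only buys 10^{-0.05} ~ 0.89, an ~11% reduction -- hence the orders-of-magnitude
% framing for a clearly noticeable difference in use.
```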
Wut?
Alchemist and Battlemage cards were fine.
Edit: oh no. It’s a pivot to AI compute 🤦‍♂️
I don’t know shit about that poster, but that’s a very English phrase.
Also, I’m not going to look into their post history. Partially because I’m lazy, partially just to spite you.
Xoxo
“oh great, competition in a market with no competition. Horrible.”
Intel has already been making discrete GPUs for two generations. They’re very cheap, and while they aren’t the most performant, they’re fantastic for the price.