Intel's $249 Arc B580 is the GPU we've begged for since the pandemic | PCWorld

https://lemmy.dbzer0.com/post/32651941


If even half of Intel’s claims are true, this could be a big shake-up in the midrange market, which has been entirely abandoned by both Nvidia and AMD.

If they double up the VRAM with a 24GB clamshell card, this would be a great “self hosted LLM” machine.

3060 and 3090 prices have been rising like crazy because Nvidia is VRAM-gouging and AMD inexplicably refuses to compete. 16GB on the A770 is kinda meager, but 24GB is the point where you can fit the Qwen 2.5 models that are starting to perform like the big corporate API ones.
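Rough back-of-envelope math for why 24GB is the threshold. This is my own sketch, not from the thread: it assumes a Q4-class quantization at roughly 4.5 bits per weight and adds ~15% overhead for KV cache and activations, both ballpark figures.

```python
# Rough VRAM estimate for running a quantized LLM.
# Assumptions (mine): ~4.5 bits/weight (Q4-class quant), ~15% overhead
# for KV cache and activations. Real numbers vary with context length.

def vram_gb(params_billion: float, bits_per_weight: float = 4.5,
            overhead: float = 1.15) -> float:
    """Approximate GB of VRAM needed to run a model of the given size."""
    return params_billion * bits_per_weight / 8 * overhead

# A 32B model (e.g. a Qwen 2.5 32B variant) at ~4.5 bits/weight:
print(round(vram_gb(32), 1))  # ~20.7 GB: fits in 24GB, not in 16GB
# A 14B model squeaks into a 12GB card:
print(round(vram_gb(14), 1))  # ~9.1 GB
```

Under these assumptions, a 32B model is exactly the class of model that a 24GB card unlocks and a 16GB A770 can’t hold.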

I always wondered who they were making those mid- and low-end cards with a ridiculous amount of VRAM for… It was you.

All this time I thought they were scam cards to fool people who believe that bigger number always = better.

Yeah, AMD and Intel should be running high VRAM SKUs for hobbyists. I doubt it’ll cost them that much to double the RAM, and they could mark them up a bit.

I’d buy the B580 if it had 24GB of RAM; at 12GB, I’ll probably give it a pass because my 6650 XT is still fine.

Don’t you need Nvidia cards to run AI stuff?

Nah, ollama works w/ AMD just fine; you just need enough VRAM to fit the model.

I’m guessing someone would get Intel to work as well if they had enough VRAM.
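For the AMD case, a minimal sketch of what this looks like in practice. The model tag is an assumption on my part (any model that fits in VRAM works); the `HSA_OVERRIDE_GFX_VERSION` workaround comes from ollama’s GPU docs for RDNA cards that aren’t on the official ROCm support list.

```shell
# Pull and run a model on an AMD card with ollama's ROCm build.
# (Model tag is illustrative; pick one that fits your VRAM.)
ollama pull qwen2.5:14b
ollama run qwen2.5:14b "Hello"

# Per ollama's GPU docs, some unsupported-but-close AMD GPUs can be made
# to work by overriding the detected GFX target before starting the server:
# HSA_OVERRIDE_GFX_VERSION=10.3.0 ollama serve
```

This is a usage sketch, not a guarantee; check the linked gpu.md for the current list of supported cards.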

ollama/docs/gpu.md at main · ollama/ollama
