Nvidia delivers first Vera Rubin AI GPU samples to customers — 88-core Vera CPU paired with Rubin GPUs with 288 GB of HBM4 memory apiece
288 GB HBM4 memory
jfc…
Looking at the specs… fucking hell, these things probably cost over $100k.
I wonder if we’ll see a generational performance leap with LLMs scaling to this much memory.
Yeah they’re going to cost as much as a house.
I think we’ll see much larger active portions of bigger MoEs, and larger context windows, which would be useful.
The non-LLM models I run would benefit a lot from this, but I don’t know if I’ll ever be able to justify whatever they end up costing.