Intel details everything that could go wrong with US taking a 10% stake
Is Intel still hanging on purely on the back of corporate computers? Or is there something else they're a large part of?
This might just be my bubble, but I feel like everyone I know over the last 15 years has been exclusively getting AMD, whether they're tech savvy or just a regular consumer.
Their new GPU has a pretty solid price/performance.
CPU is shit though
I got a new work laptop recently. First one I've ever had that didn't have an Intel CPU. Company is a decent sized multinational.
I think it's already turning. But at the same time I don't think the US can afford to let Intel fail entirely.
the person above said:
anyone i know over the last 15 years has been exclusively getting AMD
that is 100% nonsense.
Oh I agree with you, but in my experience the people I know have predominantly gone AMD as well. When I bought my 9900k, Reddit was HEAVILY downvoting any Intel support and upvoting AMD support. It doesn't reflect the market, but I do see it in social trends.
…that said, while my 9900k still kicks ass, I am never going Intel again after recent news hahaha
Defense contracting.
They do a good amount of military-industrial contracting and work for three-letter agencies on data processing / high-performance computing.
They also got awarded government funding in 2024 to build logic chips for the military in-country.
Not enough to sustain the company, but such "sensitive" programs may not be allowed to show up in revenue reports, or may have to be booked under other segments.
I’d buy a macbook, but it’s a lot more expensive than my “throw Linux on a used corporate thinkpad” approach, and I can tolerate macOS, but don’t love it. If you’re in the market for a new premium laptop, I think they’re pretty established, and I do think people are buying them.
Ampere workstations are cool, but in a price range where most customers are probably corporate, and they’ll mostly buy what they know works. I think their offerings are mostly niche for engineers who do dev work with stuff that will run on arm servers.
I’d say non-corporate arm adoption will grow when there’s more affordable new and used options from mainstream manufacturers. Most people won’t go for an expensive niche option, and probably don’t care about architecture. Most Apple machines probably sell because they’re Apple machines, not because of the chip inside.
I don't know exact numbers, but I do feel that ARM server adoption isn't going too badly, especially with new web servers.
Probably applies to most used laptops right now. Also, I have some ThinkPad nostalgia, but the similar SKUs from other manufacturers will also do, though they of course have the same problem.
Generally, you of course always need to research the specific hardware. Also, my current one is on an 8th-gen Intel, and it still does the job for now.
Enterprise ARM servers exist, I’ve used them, they’re neat.
With a proper stack you don’t even notice they’re arm
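To put a concrete sketch behind that: with an interpreted or managed runtime, identical application code runs on either architecture, and about the only way to even notice the ISA is to explicitly ask for it. (This snippet is my own illustration, not something from the thread; the `platform` query is purely diagnostic.)

```python
import platform

# Portable application code never branches on the instruction set; the
# runtime abstracts it away. Asking for the machine type is only ever
# useful for logging/diagnostics.
arch = platform.machine()  # e.g. "x86_64" on Intel/AMD, "aarch64" or "arm64" on ARM
print(f"running on {arch}; the application code is identical either way")
```

The same applies to compiled stacks with mature cross-compilation (Go, Rust, JVM bytecode): as long as your dependencies ship ARM builds, the architecture mostly disappears from day-to-day operations.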
Modern times aren’t like the past.
Don’t get me wrong, the market will probably be worse if Intel were to go bust (certainly in the short term), but it wouldn’t be anywhere near as devastating as it would’ve been 10, 15, 20 years ago.
x86 isn’t the only viable architecture in town anymore.
Apple and others have proven that ARM is certainly viable for PCs.
Yes, Qualcomm’s X Elite was a complete dud, but that’s more on their/MS’s absolute shit show of driver/firmware/graphics API development, not on the hardware. Nvidia’s ARM stuff is already more mature.
Now imagine if Intel disappeared. AMD simply would not be able to meet the demand required; it'd trigger an arms race of companies pushing ARM and RISC-V development. Nvidia has made no secret of wanting to get deeper into CPUs.
Shit, as unlikely as it initially seems, there’s so much money on the table that Apple could even consider selling SoCs (although even if they did, I imagine they’d retain the best for themselves, or charge a huge premium).