how can something be so courageous and yet so true
I think you have been looking at Apple's benchmarks. The M series isn't the fastest in multi-core now, nor was it when it was released, though it might have been when the first Max chip came out; hard to tell. They did have a small lead in single-core at one point, I think. It might beat the latest AMD by a small margin, depending on which benchmarks you look at.
nanoreview.net/…/intel-core-i9-13900k-vs-apple-m3…
Ah okay! Thank you for providing sources. It still seems like a competitive chip, but I see I was wrong about it being outright better.
I think the first two comparisons are a little unfair (unless I'm mistaken), since those are both desktop-spec chips and Apple doesn't realllyyy do desktops anymore.
The second one is a laptop chip; you may need to brush up on CPU naming. I actually meant to pick a laptop chip for the first comparison as well, but I made an error. The M3 Max is used in both desktops and laptops. Stop thinking of the M series in terms of desktop or laptop; it's meant to be a multi-purpose chip. Even with Intel and AMD you see chips and dies getting reused between desktops and laptops.
Anyway here is one with an Intel laptop chip: nanoreview.net/…/intel-core-i9-14900hx-vs-apple-m…
Although if we are comparing desktop-only chips, then AMD have Threadripper Pro and Apple has the M3 Ultra. I am pretty sure I know who comes out on top (hint: it's the one with up to 96 cores).
Mac Pro? The first Ultra-class chip came in the Mac Studio. They missed an opportunity to call it a Mac Pro Mini.
They’ve actually improved cooling a fair bit. The two chips I mentioned are definitely power hungry, though. That Intel chip can draw up to 115 W sustained; I don’t think I want to know what the burst power is. The larger MacBook Pro might actually get close to that (I think Apple’s solution is on the order of 100 W if you push both CPU and GPU simultaneously). The Apple system is considerably lower power once you start looking at GPUs as well.