HYPOTHESIS: while Moore's Law dominated performance in laptops, the rule was "cheap, fast, low power—pick any two".

Moore's Law is coming to an end. The MacBook Neo says "why choose?"

Nobody needs a laptop with a 40 hour battery life. Nor does anybody need 200 CPU threads and an AI coprocessor and 256GB of RAM and 8TB of SSD. So we're finally seeing the sweet spot in the phase diagram drift inexorably towards the corner labelled "cheap".

@cstross it nails the “most people’s use cases” in a price point and feature set that’s really hard to argue with
@cstross ... an actual honest-to-goodness 3.5mm headphone jack ... weren't Apple the ones that started the obnoxious trend to eliminate those? totally makes sense for an item at this price point, recognising that not everyone can afford the $200 bluetooth headsets ...
@cstross hmmm ... I am tempted ... I had a 14in MacBook Pro back in the day and OSX/Darwin was a sweet OS to do things in ... my chuck-about console device is a horrendously underpowered Asus flip notebook/tablet thing, so a refresh on that has potential ...
@mherbert @cstross You’re thinking of phones. The laptops still have them.
@cstross It’s probably the first instance of what will turn out to become _A Laptop_ (no further qualifications necessary, because it does everything everybody expects and needs. Edge cases and niche applications need not apply.)
@cstross I believe it does have the AI coprocessor
@davidgerard It does, and I've got Apple Intelligence firmly switched off.
@cstross 8GB RAM definitely still feels like it could be a limiting factor, though. Although to be fair iOS handles it pretty well.

@Salty @cstross my experience is just comparing RAM sizes is misleading, just like comparing GHz etc.

They've highly optimized it - I'd say 8GB is enough for most "normal" use cases even though that sounds surprising. So, why pay the memory tax?

I think we've been marketed into believing we need lots of RAM (also indoctrinated into believing we do by history, edge use cases and the profligate nature of some OS environments).

I don't have a Neo. But, I have a MacBook Air M1 we got as a freebie when Apple first released aarch64 Arm SoCs. That's the 8GB base spec.

I assumed it'd be a poor experience when I got it. But, it works absolutely fine with multiple browsers/tabs, libre office, untitled goose game, etc - all those things that probably constitute "normal" computer use. And that is a few generations ago.

Unsurprisingly it doesn't work fine for technical tasks like building large SW stacks or hosting VMs. But, that's a way smaller cohort's use case - outside Mastodon at least!

@markn @cstross Yes, iOS does well on 8GB (or less). I guess I'm still a little pleasantly surprised that full-blown MacOS does too!
@cstross I did wonder what Apple was going to do with the “our base CPU is more powerful than most people need it to be” problem besides render the UI through a VFX pipeline.
@cstross I'm curious what's going to happen now that 90% or more of computer users can do everything they want with a $500 laptop. That same level of machine would have struggled with 10 browser tabs just a minute ago

@ebooksyearn Yes. As it happens I have a ~$500 machine from 2 years ago. Intel N100 CPU, 12GB RAM, same size SSD: runs Linux Mint nicely, but the flip side is the battery life is about 2h30m instead of 16h. A deal-breaker, that.

Apple *somehow* squared the circle.

@cstross @ebooksyearn interesting! My old n450 (I think it was ..) laptop (ok. Netbook) managed more than 8 hours easily, I used it every day on the commute, writing papers or code. Wouldn't work for my eyesight these days, though.
And I have been arguing that we have reached good enough for a while. My kids' second hand ThinkPad is not really worse than my newer one. Except for battery life due to wear.

@cstross
So, which business models are obsoleted now that compute is a commodity?

Is it maybe the folks that scream you need AI in everything, so that more datacenters need to be built? Can't allow people to be happy on decade-old hardware, because that dampens demand.

@Sweetshark No, those people are scam artists, nothing more and nothing less. (Aside from the delusional sheep who're following them because they don't understand the basics of CS, much less the cognitive psychology hack that makes the tech-illiterate mistake a "Chinese room" for a person.)
@cstross @Sweetshark
The Chinese Room was debunked 40 years ago, and I still get people quoting it at me. Not to speak of the people who don’t understand that Neural Nets are not anything like biological neurons. I get tired of explaining CogSci 100 (prerequisite for 101).
@SpeakerToManagers Chinese Rooms as a procedural system were Searle's attempt at refuting the idea of simulation on philosophical grounds. (He was wrong.) But chatbots with no underlying model of the world aren't conscious either.
@cstross
True. Searle assumed some godlike being carefully filled in the googol or so entries in the lookup tables that controlled the way the little man (or was it a p-zombie? I get confused) inside the room created the translations. I am seriously annoyed by thought experiments that start with incoherent postulates.
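A back-of-the-envelope sketch of why that lookup table is an incoherent postulate. The vocabulary size and sentence length here are purely illustrative assumptions, but the combinatorics come out at roughly a googol either way:

```python
# Rough combinatorics of Searle's lookup table: one entry per possible input.
# Assumed numbers (illustrative only): a 10,000-word vocabulary and
# inputs capped at 20 words.
VOCAB = 10_000
MAX_WORDS = 20

entries = VOCAB ** MAX_WORDS  # distinct 20-word inputs
googol = 10 ** 100

print(f"Table entries needed: 10^{len(str(entries)) - 1}")  # 10^80
print(f"Atoms in the observable universe: roughly 10^80")
print(f"A googol: 10^{len(str(googol)) - 1}")
```

So even this toy version of the room needs about as many table entries as there are atoms in the observable universe, before anyone fills them in.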

@cstross

Got a cheap notebook from 'reward points' at work. I named it 'cromulence'; everything about it is (just) acceptable.

CPU is okay, screen is meh, battery life is good enough. RAM and storage were barely sufficient, but I was easily able to open it up and add RAM and a better NVMe drive I had lying around. Of course I put Linux on it. (Those last three are not common, of course...)

That was before the Neo, which has much better specs - except I can't bump up the RAM or storage on it. 🤷

@cstross It reminds me of when people used to ask me whether to get more RAM or a faster processor and I said, "Buy the largest monitor you can afford, and if you have any money left over, buy a computer."
@stevendbrewer @cstross That's basically what I did with this computer. Wanted a nice IPS monitor for photo editing, the computer itself was more or less an afterthought.

@cstross

I checked hardware prices for servers yesterday, and a 16 GB DDR5 RAM module had a purchase price of €1600.

How long will "cheap" remain an option under current market conditions?

@juergen_hubert @cstross, ouch.

The last RAM which I bought (2×8GB DDR4 3200) cost about £40, though that was 2½ years ago. If the increase in price were in line with inflation, it'd be somewhere around £43 to £45 – but no. From the same supplier, it now costs £145.

Let that popping sound be heard soon…
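The arithmetic in that comparison checks out. A quick sketch, assuming the roughly 4% annual inflation rate the £43 to £45 estimate implies, compounded over the 2½ years:

```python
# Sanity-checking the RAM price comparison above.
# Assumption (illustrative): ~4% annual inflation, compounded over 2.5 years.
paid = 40.0          # £ for 2x8GB DDR4-3200, ~2.5 years ago
current = 145.0      # £ today, same supplier
years = 2.5
inflation_rate = 0.04

inflation_adjusted = paid * (1 + inflation_rate) ** years
print(f"Inflation-adjusted price: £{inflation_adjusted:.2f}")   # ~£44
print(f"Actual price increase: {current / paid:.1f}x")          # 3.6x
```

An inflation-tracking price would sit around £44; the actual price is more than three times that.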

@juergen_hubert @cstross There’s always looting ai data centers as an option. Probably not a solo thing.

@su_liam @cstross

I recommend checking in advance whether it actually has servers, or is just another financial investment shell that only pretends to have any.

@su_liam Best idea ever. Loot data centers for ram for your punk compute projects. Should be a short story if not just reality. Or a standard salutation:

I’m off to loot data centers for ram for my punk compute projects. See ya

@abekonge It is cool. I’d still prioritize crowds of citizens cracking open the ICE concentration camps. A new American Bastille Day is better than cyberpunk looters. Not to say we shouldn’t do both.
@cstross outside of the Apple world we've had them for decades

@cstross Is this not just Apple doing what other laptop manufacturers have been doing for a while? My sprogs' laptops, which they use for school work and playing games, are a higher spec than the Neo and were cheaper.

I mean it's great that you can now enter the walled garden of a massively overrated UX for cheap, but, like most Apple products, it's nothing new or better.

@cstross

Someone once said the Moore's law of software is that the number of CPU instructions it takes to add two numbers doubles every two years.

@Phosphenes
🤭
I like this. It's good to chuckle - even if the cause is inherently depressing.
@cstross
@phpete @Phosphenes I realized we were doomed NO LATER THAN the excited new feature announcement from Microsoft to the effect that Windows Vista would perform anti-aliasing on the cursor drop-shadow.

@cstross
Hey, some people actually WANT cloud compute powered pretty things with constantly spinning color wheels that can only open one ad filled tab at a time.

I don't know any of them personally, but I'm told they exist.

They probably just go to a different high school.

In Canada or something.

@Phosphenes

@phpete @Phosphenes GUIs have been going downhill ever since roughly MacOS 9, or maybe KDE 3.5. (Windows 98 is dead to me.)

@cstross
Careful now - accurate or not, the "get off my lawn / back in MY day" tone is rising.

We're gonna lose support from anyone who doesn't remember configuring hardware via jumpers and frankly they've got the numbers. We need them on our side! Quick, say something about avocados!

;-)

@Phosphenes

@phpete @Phosphenes No, really, I *like* skeuomorphism: Jony Ive can get in the fucking sea!

@cstross
Ok, but joking aside we've clearly tipped on some of the icons, right?

Save as a 3.5? I'm not suggesting these things change, but I've had to explain the concept of square disks twice this school year alone.

If that's staying, then clicking it had better create a local file. Save to cloud needs something deliberately 'other'.

@Phosphenes

@phpete @cstross

Yeah Windows makes saving to local more and more clicks away with every release. They're *really* pimping their online storage, and I don't want.

@phpete @cstross @Phosphenes I work with so many of them

People who genuinely believe sales pitches and hype; eagerly and proudly post copilot slop into teams

Really truly feel like companies have our best interests at heart

@auzdavenice

Agreed, we joke but many specific people in my life come to mind - I simply don't understand it.

@cstross @Phosphenes

@cstross @phpete @Phosphenes The bit that boggled me at the time was when it demanded a brand new SM3 GPU to do a bit of transparency and a 3D task switcher. Meanwhile Beryl/Compiz Fusion could do all manner of wild 3D desktop things on a 7 year old GeForce 2.
@Phosphenes @cstross and the number of frameworks which have a runtime of at least 1 MB each, which can be used to display hello world to the screen, doubles every two years as well

@cstross

The lack of RAM really will be a disadvantage in the next few years.

If they could have sold it with 16 or 24 GBytes then it would be an easy recommendation for a "good enough"¹ laptop for many people.

If Apple can't make that happen, then it's probably not economic, or they are engineering for rapid obsolescence and enhanced shareholder value.

The used market should be buoyant though as people upgrade to more balanced specification laptops.

Alternatively someone might come up with a replacement motherboard that fits in the Neo chassis as a drop in replacement 😉

¹ Good enough in the Jerry Pournelle sense of that phrase.