There's a shift happening. OpenAI is shutting down Sora, presumably to focus on enterprise offerings. Walmart and Disney are cutting ties with OpenAI. Sam's about to get sued by Microsoft.

Conversely,

Nvidia's NemoClaw seems like a legitimate effort towards on-device AI. Apple's M5 chips are decked out with new AI technology, hinting that they might do something in the local AI space.

I think cloud-based AI is cooked. It's too expensive. The market is shifting. Or I'm high on my own supply.

I say this not as someone who is pro-local AI, but as someone who is against the cloud. And against the sort of mass power grab enacted by the hyperscalers over the past five years.

It appears they're failing. And short of passing legislation to codify some sort of monopoly, their vision of our digital lives is not coming to fruition.

@fromjason There's obviously an AI bubble that's going to burst at some point (I don't know when). But until local models at least as good as the current frontier models can run on a phone, we're going to have cloud AI. We are years away from that.

Cloud-based AI will always be cheaper, surely, because machines get shared in a fairly efficient way?

@john oh for sure, cloud AI will always exist. What I think is failing at the moment is the strategy to kill the open-source on-device market and put AI behind a paywall. Or to centralize computational power into data centers so that local AI models never even get a chance.

Now we have devs figuring out how to run a 400b parameter model on an iPhone 17 Pro. At the same time, we're learning that the biggest and most powerful models aren't always the best, all things considered.

@fromjason Yes, local, open models are surprisingly good in comparison with frontier models, and in general this is playing out better than it could have, for sure.