Submitted without commentary: "AI Might Be Our Best Shot At Taking Back The Open Web" by Mike Masnick https://www.techdirt.com/2026/03/25/ai-might-be-our-best-shot-at-taking-back-the-open-web/

@cwebber I don't understand people claiming they can self-host an independent "AI agent". It doesn't square with how I understand they work. It doesn't square with the immense data centers being built.

And while at first I'm sympathetic to his complaint about losing the open Web, he goes off the rails. You can still create web pages. It's just text. HTML tags are optional. He's fantasizing about singlehandedly competing with Microsoft or something.

@foolishowl I disagree with much of his analysis, but he mentions Ollama for running local models. I’ve found it easy to set up as well, though any models you can run locally currently aren’t as “capable” as the cloud-hosted models. But the gap is narrowing for sure

@basetwojesus Honestly, this puzzles me.

If Ollama can be run entirely locally, and is almost as "capable" as the commercial services, then, what's with the colossal scale of the commercial services?

Most of the last five years I've had jobs in data centers that were building out HPC systems. We weren't allowed to know what the systems were doing, but our understanding was it was all about "AI" research. The compute and data storage capacity of a small data center dwarfs what a personal computer can do.

So if that kind of scale isn't actually necessary for "AI" (and I'm opposed to data center construction and "AI" anyway for many reasons), then what's all that capacity for?

@foolishowl I wouldn't say local models are almost as capable as the cloud models when comparing their upper limits. But many use cases don't really need all that juice. For example, if you have Ollama installed you can use a command like this to check a blog post for grammar:

`ollama run lfm2 "Check the following text for grammar and spelling, leaving the style and substance intact: $(cat post.md)"`
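If you outgrow the one-liner, the same local model can also be called through Ollama's HTTP API, which listens on `localhost:11434` by default. A minimal Python sketch, assuming Ollama is running locally and the `lfm2` model has been pulled (the helper names here are my own, not part of Ollama):

```python
import json
import urllib.request

# Ollama's default local endpoint for one-shot completions
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, text: str) -> dict:
    """Build the JSON payload for a one-shot grammar check."""
    prompt = ("Check the following text for grammar and spelling, "
              "leaving the style and substance intact:\n\n" + text)
    # stream=False asks Ollama for a single JSON response instead of chunks
    return {"model": model, "prompt": prompt, "stream": False}

def grammar_check(model: str, text: str) -> str:
    """Send the prompt to the local Ollama server and return its reply."""
    payload = json.dumps(build_request(model, text)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=payload,
        headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Example (requires a running Ollama server with the lfm2 model pulled):
#   print(grammar_check("lfm2", open("post.md").read()))
```

Nothing here leaves the machine: the request goes to a local socket, which is the whole point of the self-hosting argument above.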

For code assistants, etc., though, cloud models are probably still necessary