@cwebber I don't understand people claiming they can self-host an independent "AI agent". It doesn't square with how I understand they work. It doesn't square with the immense data centers being built.
And while at first I'm sympathetic to his complaint about losing the open Web, he goes off the rails. You can still create web pages. It's just text. HTML tags are optional. He's fantasizing about singlehandedly competing with Microsoft or something.
@foolishowl @cwebber
1) Inference is cheap compared to training — that's the industry jargon. Building a massive neural network is hard and compute-intensive, but running the trained network forward to predict the next word is (relatively) cheap. Smaller networks are, obviously, even cheaper to run.
2) Most of the small models designed for running locally are very large models that have been pruned, quantized, or distilled down to a fraction of their original size. There are a bunch of ways to do that. But it means that people advocating for small local models either don't understand where those models come from, or are fine with the destructive large models being created in the first place.
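To put rough numbers on point 1, here's a back-of-envelope sketch using the common approximations that training costs about 6·N·D FLOPs and one inference forward pass costs about 2·N FLOPs per generated token (N = parameter count, D = training tokens). The specific N and D below are illustrative assumptions, not figures for any real model:

```python
# Rough training-vs-inference compute comparison.
# Approximations: training ~ 6*N*D FLOPs, inference ~ 2*N FLOPs per token.
N = 70e9   # parameters in a hypothetical 70B-parameter model (assumption)
D = 2e12   # tokens in a hypothetical training corpus (assumption)

training_flops = 6 * N * D          # one full training run
inference_flops_per_token = 2 * N   # one forward pass per generated token

# How many tokens of inference equal one training run's compute
tokens_equivalent = training_flops / inference_flops_per_token
print(f"training:  {training_flops:.2e} FLOPs")
print(f"inference: {inference_flops_per_token:.2e} FLOPs/token")
print(f"one training run ~ generating {tokens_equivalent:.2e} tokens")
```

Whatever the exact constants, the asymmetry is the point: the one-time training run dwarfs any single user's inference, which is why "I can run it on my laptop" and "it took a data center to make" are both true at once.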
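For point 2, a minimal sketch of one of those shrinking techniques — magnitude pruning, which zeroes out the weights with the smallest absolute values. Real pipelines (structured pruning, quantization, distillation) are far more involved; the random matrix here is just a stand-in for one layer's weights:

```python
import numpy as np

rng = np.random.default_rng(0)
weights = rng.normal(size=(512, 512))  # stand-in for one layer (assumption)

def magnitude_prune(w, sparsity):
    """Zero out the `sparsity` fraction of weights smallest in magnitude."""
    threshold = np.quantile(np.abs(w), sparsity)
    return np.where(np.abs(w) < threshold, 0.0, w)

pruned = magnitude_prune(weights, sparsity=0.9)
print(f"nonzero before: {np.count_nonzero(weights)}")
print(f"nonzero after:  {np.count_nonzero(pruned)}")
```

Note what this does and doesn't do: it makes the model cheaper to store and run, but the pruned model only exists because the full-size one was trained first.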