@cwebber But is it that, though?
Or are people still looking at it from a reactionary point of view?
Are people using this technology in ways it was never intended to be used, because it was sold to them by people who wanted to maximize profit by any means necessary?
This is like saying that HTML is the devil because Palantir uses it to code their websites.
It’s like saying the compass is evil because that’s how the slave ships navigated the Atlantic Ocean.
@majorlinux @cwebber I don't think so...
Some love AI, and yes, I see people saying 'use AI against itself!' but frankly that still makes AI users dependent on it, and less intelligent and critically-minded.
I just don't think you can build an activist rights-reclamation movement on the bones of grift, stolen rights, and climate destruction.
@john Does it, though?
For some, you can’t lose what you didn’t have in the first place.
Now, I’m not saying (as I have mentioned before) that all of a sudden everyone is an artist. If you can’t draw, you can’t draw. Accept it!
But what I’m saying is that some people do have executive function or other cognitive function issues that have been solved through AI tools.
I use it to help take notes, document work I’ve done, and for reference of said notes.
@john I’ve used it to automate complex tasks that n8n or Ansible can’t really do reliably based on copious amounts of notes that I have.
There are tools out there that can help people and I feel that gets swept under the rug thanks to reactionary forces at play.
Nobody stops to think critically about the tools and what a future with them could look like that doesn’t involve the exploitative nature of capitalism.
That’s because we’re all too busy to stop and imagine a better world, period.
@Garonenur @john Or, and hear me out, we create new tools that can leverage existing technologies that could help.
You make it seem like the exploitative way was the only way.
Again, this is where I say people lack imagination when it comes to seeing a society free of exploitation.
Just because OpenAI created something that destroys the planet doesn’t mean the people would have.
We have to build with intention, not profit.
@Garonenur So there aren’t vast amounts of data at our disposal that don’t have it?
We can’t carefully curate what’s being trained and how we train it?
Again, if you blame the tech and not the people behind it, you let them off the hook to exploit again. Meanwhile, we’re left with nothing, because we chose to villainize the people who actually have solutions to get us out while still holding the exploiters accountable.
@cwebber I think he has some good points, but in the end it’s IMO a fallacious argument, because he thinks that everyone wants to build websites and apps and whatever, when 99% of people will never care about creating anything.
The best way to encourage people to take back the web would be (I know I’m an optimist) to destroy the monopolies of big tech and work relentlessly on making programming languages easier to use.
Just like with art, the only barrier to learning how to program is the learning part. Almost every language is free to use, so it doesn't cost anything except effort (and a basic computer, but then you need that for AI too.)
The “agentic AI” in this story didn't do anything he couldn't have done himself.
AI advocates don't want to learn how to program, they want someone else to do it for them. For free, or as cheaply as possible...
@cwebber I really had my eyes opened while on PTO for the holidays, when I used Cursor to configure my Home Assistant platform.
Up until then I was struggling to understand the vast amount of yaml configuration needed.
I have lights, temp sensors, and appliances all hooked in, but the automations sucked. A few days of working through the auto option in Cursor, and I had everything set up the way I wanted and no cloud based megacorp can take that away from me.
@basetwojesus it's a programming IDE application, which then uses a sort of 'meta model' under the hood to try and accomplish your queries and tab completes in a cost efficient way using different models. Defaults are all 3rd party cloud models, but you can plug in your api keys for Claude or host your own.
You do have to subscribe to Cursor itself, and then there's presumably some kind of markup over the direct cost of the cloud model queries so they can make a profit.
@rickpelletier @basetwojesus Seems apropos...
They recently moved a bunch of models to MAX mode, causing heavy users at our company to gobble all their tokens in days.
Guess the low cost days are over, and AI vendors are going to start squeezing the addicts to pay for their addictions.
@hendric @basetwojesus I think the point Masnick is trying to make is that using these tools for personal projects is a way to free ourselves from Big Tech - whether you do it today with a double-cost but super easy tool like Cursor or with a home built custom thing.
Either way, we're heading back into a world where people can just build the software they want.
@basetwojesus Yeah, maybe, but is it better to only have systems that we understand?
I'd rather have stuff that does what I want, even if I don't understand everything about how it works, than have nothing
@basetwojesus fair point. I think Masnick and I are both in the position of already knowing enough about the tech to describe what we want competently without worrying that the implementation will be magic spaghetti.
Right now, though, I think the vast majority of 'normal' people think they can't do anything because the barrier to entry is high.
With a $20 sub to Cursor and some kind of computer to run it on, anyone can build themselves a task management app in a couple of days.
@basetwojesus my very first real piece of software was built for the IT department at the University of Miami. It was a web page where we had hard-coded the area restaurants we liked, categorized by type. Then, it had a little database.
The purpose was to randomly select the lunch choice for the department, with logic to prevent choosing the same type two days in a row, and weights for preferring some more generally popular types.
@basetwojesus it took me like two weeks because I was learning early early .net at the time.
Today, that app can be built by literally anyone in 15 minutes, and nobody has to use some Premium Yelp random picker thing filled with ads
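The lunch-picker logic described above (a weighted random draw over restaurant types, with a rule against repeating yesterday's type) really is a few minutes of code today. A minimal Python sketch; the restaurant names and weights are made up for illustration:

```python
import random

# Hypothetical data standing in for the hard-coded restaurant list:
# restaurants grouped by type, with popularity weights per type.
RESTAURANTS = {
    "cuban": ["Place A", "Place B"],
    "pizza": ["Place C"],
    "sushi": ["Place D", "Place E"],
}
TYPE_WEIGHTS = {"cuban": 3, "pizza": 2, "sushi": 1}

def pick_lunch(yesterdays_type=None):
    """Pick today's lunch: exclude yesterday's type, then do a
    weighted draw over the remaining types."""
    types = [t for t in RESTAURANTS if t != yesterdays_type]
    weights = [TYPE_WEIGHTS[t] for t in types]
    chosen_type = random.choices(types, weights=weights, k=1)[0]
    return chosen_type, random.choice(RESTAURANTS[chosen_type])
```

That's the whole app, minus the web page around it.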
@vv @cwebber what really gets me is the article author's response to this comment.
He gets defensive, makes it about him and his feelings. He seems to think that Derek's anger is because of some insinuation in the article that Derek himself supports AI -- instead of being horrified that the legacy of his art is being twisted to these ends.
The AI boosters take criticism of their beliefs as a personal attack. Because the idea that people have different beliefs about AI is unimaginable to them.
@cwebber I don't understand people claiming they can self-host an independent "AI agent". It doesn't square with how I understand they work. It doesn't square with the immense data centers being built.
And while at first I'm sympathetic to his complaint about losing the open Web, he goes off the rails. You can still create web pages. It's just text. HTML tags are optional. He's fantasizing about singlehandedly competing with Microsoft or something.
@basetwojesus Honestly, this puzzles me.
If Ollama can be run entirely locally, and is almost as "capable" as the commercial services, then, what's with the colossal scale of the commercial services?
Most of the last five years I've had jobs in data centers that were building out HPC systems. We weren't allowed to know what the systems were doing, but our understanding was it was all about "AI" research. The compute and data storage capacity of a small data center dwarfs what a personal computer can do.
So if that kind of scale isn't actually necessary for "AI" (and I'm opposed to data center construction and "AI" anyway for many reasons), then what's all that capacity for?
@foolishowl I wouldn't say local models are almost as capable as the cloud models when comparing their upper limits. But many use cases don't really need all that juice. For example, if you have Ollama installed you can use a command like this to check a blog post for grammar:
`ollama run lfm2 "Check the following text for grammar and spelling, leaving the style and substance intact: $(cat post.md)"`
For code assistants, etc., though, cloud models are probably still necessary.
@vv @foolishowl @cwebber one of my favorite js packages is this, which brings inspection to mobile too!
Simply appending debug=true to any query string will pop it up.
Like, there are real difficulties with the open web. But, it's not that hard to self-host a website. An SBC or a used computer is less expensive than a computer capable of web hosting was in the 90s.
But, he's the Techdirt guy. He already has a website. So what's he complaining about? Not being able to create new, more complicated applications, apparently.
He's complaining about the loss of an open web the way some USonian writers complained about the closing of the frontier in the early 1900s. It's probably easier to go horseback riding in Wyoming now than it was then. But that's not what he's missing.
@foolishowl @cwebber
1) Inference costs are cheap compared to training. That's the industry jargon. Basically - building massive neural networks is hard and intensive. But actually doing the walk from a start to a next word prediction is cheap (relatively). Obviously smaller networks are cheaper to walk.
2) Most of the small models designed for running locally are very large models which are then pruned down so that they don't have as many parameters. There's a bunch of ways to do that. But it means that people advocating for small local models either don't understand them, or are fine with the destructive large models being created.
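To make point 2 concrete, here's a toy sketch of magnitude pruning, one common way models get shrunk: zero out the smallest-magnitude weights so the network is cheaper to store and run. This is only an illustration of the idea, not how any particular local model was actually produced.

```python
import numpy as np

def magnitude_prune(weights: np.ndarray, sparsity: float) -> np.ndarray:
    """Zero out (at least) the `sparsity` fraction of weights with the
    smallest absolute value, keeping the array's shape intact."""
    flat = np.abs(weights).ravel()
    k = int(len(flat) * sparsity)
    if k == 0:
        return weights.copy()
    # The k-th smallest magnitude becomes the cutoff threshold.
    threshold = np.partition(flat, k - 1)[k - 1]
    pruned = weights.copy()
    pruned[np.abs(pruned) <= threshold] = 0.0
    return pruned
```

Real pipelines (quantization, distillation, structured pruning) are far more involved, but they all start from a big trained model, which is the point being made above.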