a handful of years back, I briefly started messing around with the early generative AI models. I quickly concluded that they're more-or-less a novelty - a toy - and then grew bored and moved on.

then rich techbros started pushing that novelty toy on us in a million different ways, claiming that it could replace creatives and programmers, that it could act as our therapist and our support system, as our friend and romantic partner.

that toy infiltrated our work lives. it started sucking up energy and water. it made personal computing expensive as shit. it showed up on my phone and in my web browser without my consent. people who should be resistant to the allure of a glorified novelty started hyping it as essential to their process. people even started claiming the toy was sentient.

and the thing is, the toy hasn't gotten much better. it still produces garbage art, it still gets shit wrong all the time, and it still spits out text in the most bland, customer-service-y tone imaginable.

when I look at not just the people hyping "AI," but the people saying the equivalent of "AI sucks except for X and Y use cases," what I see them doing is taking a baby rattle and trying to convince me that it's actually a hammer, because both of them make noise when you strike something with them.

@YKantRachelRead every time someone claims that it's good for certain use cases, the claim is lacking any evidence, and usually there's already evidence to the contrary available, or a very good logical case to be made for why it shouldn't be used for that.
@elexia yep, the baseline argument I see them making is "it's better than nothing," which is kind of like saying a lie is better than no information at all because the lie is at least something

@YKantRachelRead @elexia The worst part is that BigTech has already invested so much money and effort into GenAI that they cannot turn back from their ignorance anymore. They'd not only face the void financially, but mentally too, as they'd have to admit that they dropped a nuclear bomb on humanity, leaving them suffering from cognitive dissonance.

If the market rejects GenAI, they'll have to enforce it by destroying human skill to the point where we actually believe it has a purpose.

@YKantRachelRead AI is going to be the digital equivalent of asbestos.