JulianCalaby

@juliancalaby@treehouse.systems
40 Followers
76 Following
1,053 Posts

Grumpy clown microdosing chaos. Petter of dogs. Hardware and software sadist. Applied specific curses to their own homelab for fun. Smashes code together for money. Solders together Arduino projects the wrong way. Has been in the presence of the One True Duck. Emigrating to the cloud the hard way.

I do not represent my employer here.

If you're contacting me and I don't know you, please tell me your favourite letter of the Greek alphabet within the first couple of sentences of our first conversation so I know you're serious.

Pronouns: he/him
Codeberg: https://codeberg.org/SkUrRiEr
GitLab: https://gitlab.com/SkUrRiEr
GitHub: https://github.com/SkUrRiEr
Hackaday.io: https://hackaday.io/SkUrRiEr
i have a feeling that everyone who is arguing about LLMs in the sense of stuff like productivity, quality of the output, performance or functionality of generated code, largely even licensing or whatnot is kinda missing the point

even if LLMs generated the best code in the world, i would not be using them

even if LLMs gave me the biggest ever productivity boost i would not be using them

even if the output was clean copyright-wise and fully original, i would still not be using them

i can't consciously support a worldwide slop machine that helps and finances the rise of fascism, that helps further oppression, that helps a bunch of billionaires control public opinion, that drives society-wide psychosis with far-reaching consequences, that attempts to strip all joy from activities people like while using that to make the rich even richer, and that's not even getting to environmental stuff or whatever

i'm getting left behind you say

well fuck your industry and fuck you

1. YES THEY ARE.

They are vibe-coding mission-critical AWS modules. They are generating tech debt at scale. They don't THINK that that's what they're doing. Do you think most programmers conceive of their daily (non-LLM) activities as "putting in lots of bugs"? No, that is never what we say we're doing. Yet, we turn around, and there all the bugs are.

With LLMs, we can look at the mission-critical AWS modules and ask after the fact, were they vibe-coded? AWS says yes https://arstechnica.com/civis/threads/after-outages-amazon-to-make-senior-engineers-sign-off-on-ai-assisted-changes.1511983/

After outages, Amazon to make senior engineers sign off on AI-assisted changes — Ars OpenForum

AWS has suffered at least two incidents linked to the use of AI coding assistants.

AWS serverless: for when you want to pay far too much for two containers and an S3 bucket in a trenchcoat.

#tech #aws #cloud

hearing gullible 20-somethings say "this technology DOES have good use-cases, like in medicine for example…"

is going to turn me into the fucking Joker

When I started in security, one of the prevailing attitudes was "The weakest link in the chain will always be the human."

I would like to thank every LLM provider and startup for changing this paradigm by introducing a much weaker link in the chain.

Technology is not inevitable. We've decided not to have asbestos in our walls, lead in our pipes, or carcinogenic chemicals in our food. (If you're going to argue that those haven't been eliminated everywhere, fine, but where would you rather live?) We could just not do LLMs. It's allowed.

Machine translations are often brought up as a gotcha whenever I criticize LLMs. It's worth pointing out two things: machine translations existed decades before LLMs, and yes, machine translations are useful. However: I would never in my life read a machine translated book. Understanding what a social media post is talking about in rough terms? Sure. Literature? Absolutely not. Hell, have you ever seen machine translated subtitles? They're absolute garbage.

If the murder of George Floyd was the catalyzing event that finally radicalized me against police, AI is the event that has truly radicalized me against capitalism.

Me before: "yeah it's bad and sucks and hurts us, but like...idk"
Me now: "jesus christ burn it to the ground, it is simply a parasite on even the ECONOMY let alone the people."

AI truly feels like a pinnacle of extraction of workers and environment. To turn the world into a theme park for the wealthy. It makes me think of a...post somewhere online that's like "If you want to live in a walkable city but all the people working at restaurants and coffee shops can't afford to live there, you're living in a theme park." It's what they want. Because service industry jobs WILL still exist, but everything that makes us human will be extracted and sold. I might feel differently if that money were, i don't know, given back to us. But it never would be, never could be.

And it is ever more painful because I nearly feel like I *must* use it or be fired in short order if it comes to light that I haven't been. If I don't, I will be unable to pay rent and they will hire someone else (IF ANYBODY?) to extract more from. And it'll be me and many other people (so many more qualified than me, at that) competing for the scraps of the not-all-in-on-AI companies, of which there will be increasingly few. Where are my morals except given away to the dollar for survival. Or competing for trade schools or whatever husks remain.

From what I've observed, people who claim that LLMs can replace artists don't understand art, people who claim that they can replace musicians don't understand music, people who claim that they can replace writers don't understand literature, and people who claim they can replace translators don't rely on translations. If I had a button that would erase LLMs from the world but it would take machine translations away (which is a false dichotomy anyway), I would absolutely still press it.
@mcc It's telling that the one "big" success story in the years of slop has been computer programming, an industry where a lot of people have no professional standards and are allergic to solidarity.