I try to keep track of what’s going on among those who use LLMs for coding but they all keep linking Steve Yegge and I just can’t take anybody who links to Steve “gas town” Yegge seriously.
He’s the opposite of convincing.
@trisweb @baldur Honestly, both of those are fine, though good to know when citing. Especially if the moral angle is something that's unsettled and people are casting about for better stances.
I'm starting to think, though, that a hidden piece of AI discourse is whether people can tolerate epistemically questionable sources well. Lots of people can't, it seems. That explains why we were in such a misinformation mess even before LLMs; now we're seeing the seams in places we used to be able to pretend weren't suspect.
I guess it should come as no surprise to me that people who were drawn to computers as deterministic objects would struggle with the absolute probabilistic buffoonery that LLMs generate, the same buffoonery people have always produced. And those of us drawn to computers as communication objects, in a time of a sometimes-hostile internet public, have a different base sentiment about information. (And not to commit the "of course I'm in the sweet spot" move, but the people who grew up on the chan-pilled internet after my time are, if anything, far _too_ comfortable with hostile information spaces, to the point of nihilism about it.)
@trisweb @baldur I actually disagree, but with a caveat: I think commercial speech resembles LLM output, and for good reason. It's trying to be "normal", "centrist", "inoffensive" (politically correct), broadly appealing, and largely ignorant of truth. It's quite often trying to create a marketing reality.
LLMs are just even better at this.
The incentives to produce this sort of text are still there.
@trisweb @aredridel @baldur this is part of my own aversion: that kind of understanding is pulling against the grain of the "magic wand" rhetoric, and a lot of the target market isn't particularly interested in understanding it even before that tension.
I don't see any way out of that trap, but I'm also not in a rush to analyze the big picture, since it's so clear that the status quo is an entire galaxy away from whatever equilibrium will look like.
@SnoopJ @aredridel @baldur yes. Good way to describe the problem. We really are so far off of equilibrium culturally on the whole subject and its myriad downstream impacts (economic, social, political…).
Step one might just be a sane shared understanding of what it is and how it works, to dispel some of the mythology and half-truths. That understanding could also help explain those falsehoods clearly.
A lot of this has parallels in science communication and how difficult it is to combat disinformation there. And I think, similarly, the solution lies in good old-fashioned marketing: the ability to communicate, to lead, and to treat people like reasonable humans rather than dumb enemies.
I’d like to see a foundation or benevolent project that could maybe take on that work…