Theory (which is mine) (and what it is, too): Enshittification has led directly to acceptance of LLMs, because the public is already used to software that is unfit for purpose.

@pluralistic

@wendynather @pluralistic
Yep. Search engines too: people used to search for an answer; now they ask an LLM. They didn't start doing it because they wanted a machine to think for them; they started because they stopped getting useful search results.
@TheGreatLlama @wendynather @pluralistic
They started because they were too stupid to switch off of Google to something else that was actually reliable.
@mtconleyuk What are these actually reliable alternatives? Because everything I've tried so far has either been just proxying Google or Bing results, or has its own search index that's just as unusable, only in different ways.
@das_robin @TheGreatLlama @wendynather @pluralistic
I've been using Kagi for a couple of years now and am satisfied with the results. They do offer AI slop, but I can disable all of it, and that's fine. Price is reasonable, about $5/month for my number of queries (there are cheaper tiers).
@mtconleyuk True, Kagi was pretty good when I gave it a try. But the owner's general conduct and stance on privacy specifically really made me not want to keep paying money for that, unfortunately
@mtconleyuk But also, I wanna add, saying everyone who doesn't switch away from Google is stupid and then offering a paid option as an alternative seems a little insulting to people who can't afford to easily shell out an additional couple of bucks per month
@das_robin I can understand that, and sympathise. I would probably be on DDG if I couldn’t pay for Kagi. I just think that most people never even think about what it is Google do to them and their data, and are sufficiently incurious to never explore any alternatives.
@mtconleyuk Fair. But even then I'd hesitate calling them "stupid". I'd assume most of them are simply uninformed and/or don't have the necessary mental bandwidth and time to care about understanding the issue, let alone look for solutions, next to all the other shit they already have to deal with in their lives. My point is, empathizing will usually go much further than antagonizing :)
@das_robin
Well, fair enough.
@mtconleyuk @wendynather @pluralistic
I mean... That's a nice thought, but I regularly cycle through every option on the market. None of them work remotely as well as ca. 2012 Google. LLMs aren't a useful alternative, but there aren't any good options.
@TheGreatLlama @mtconleyuk @wendynather @pluralistic It is staggeringly rare to find a search engine that will search exclusively for the literal exact term you entered. This certainly used to be the default.

@DamonWakes @TheGreatLlama @mtconleyuk @wendynather @pluralistic

Yeah, it's on purpose, to force you to use LLMs, which nobody wants

However, there are some topics where it's helped me find research papers: cases where it's hard to pin down a good set of terms because the obvious ones are all very generic

@crowdotblack @TheGreatLlama @mtconleyuk @wendynather @pluralistic It's been going on long enough that it can't have started for the sake of LLMs. I suspect it's just accommodating sloppy searches at the expense of precise ones - not unwelcome where I don't know an exact term, but hugely frustrating when I do (and none of the results contain it).
@DamonWakes @crowdotblack @TheGreatLlama @mtconleyuk @wendynather @pluralistic Are there any addons/plugins/monkey scripts recommended for filtering? Not looked properly - figured if there was something there would be lots of coverage. Seems like something mechanical previewing and filtering results would do the trick and remove the need for AI scanning the results
@Moray @DamonWakes @crowdotblack @mtconleyuk @wendynather @pluralistic
I would doubt it: the whole intent behind how they're serving up the garbage is that it isn't easily distinguished from useful content. The signal-to-noise ratio is just too low.
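For what it's worth, the mechanical filtering idea raised upthread (keep only results that contain the literal search phrase, the way quoted search used to work by default) can be sketched as a small post-filter. This is a hypothetical illustration, not any real addon or search API: the `exact_phrase_filter` name and the result-dict shape are made up for the example.

```python
def exact_phrase_filter(results, phrase):
    """Keep only results whose title or snippet contains the literal phrase.

    `results` is assumed to be a list of dicts with "title" and "snippet"
    keys -- a made-up shape, not any real search engine's response format.
    The match is case-insensitive but otherwise exact: no stemming, no
    synonym expansion, no "showing results for" substitution.
    """
    needle = phrase.lower()
    return [
        r for r in results
        if needle in r.get("title", "").lower()
        or needle in r.get("snippet", "").lower()
    ]


# Toy data standing in for a scraped results page.
results = [
    {"title": "Intro to frobnication", "snippet": "covers frobnication basics"},
    {"title": "Related topics", "snippet": "loosely relevant filler"},
]
print(exact_phrase_filter(results, "frobnication"))
```

Of course, as the reply above notes, a post-filter like this can only enforce exact matching on whatever text the engine hands back; it can't raise the signal-to-noise ratio of the underlying results.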