i do not want to get into the business of posting LLM takes but very briefly:

It feels clear to me that some people* are getting value out of using LLMs for programming. Basically see https://simonwillison.net/'s whole blog. If I think about it purely on the basis of "in a vacuum, can this help me write programs", it seems like an exciting technology.

BUT...

(1/?)

(* it also feels clear that some people are NOT getting value out of LLMs, hoping to avoid flamewars about that please)

(continued from ^)

Google search doesn't work as well anymore because the results are full of LLM-generated articles? I hear about CEOs putting pressure on their teams to produce more faster because they've been told that AI will increase productivity?

it feels sad. even though I find LLMs useful sometimes, with all of the societal impacts it often feels like it isn't actually improving my life.

(2/?)

@b0rk I think Google’s problems have more to do with its ad business than the slop at the top, which is a term I think I just coined for when the top of the search results is LLM output.

@nick @b0rk Agreed. I have NUMEROUS issues with LLMs*, but Google had been getting worse for years before ChatGPT. It has focused on general information over specific, to the point where it ignores the actual terms you search for in favor of more popular ones, presumably because more popular results equate to more ad revenue.

"Slop at the top" is a great phrase.

* Anti LLM rant implied rather than given out of respect for b0rk's request.