i do not want to get into the business of posting LLM takes but very briefly:

It feels clear to me that some people* are getting value out of using LLMs for programming. Basically see https://simonwillison.net/'s whole blog. If I think about it purely on the basis of "in a vacuum, can this help me write programs", it seems like an exciting technology.

BUT...

(1/?)

(* it also feels clear that some people are NOT getting value out of LLMs, hoping to avoid flamewars about that please)

(continued from ^)

Google search doesn't work as well anymore because the results are full of LLM-generated articles? I hear about CEOs putting pressure on their teams to produce more, faster, because they've been told that AI will increase productivity?

it feels sad. even though I find LLMs useful sometimes, with all of the societal impacts it often feels like it isn't actually improving my life.

(2/?)

@b0rk About searching getting increasingly worse, there's another side of this I've thought was interesting.

A non-negligible number of people no longer ask their technical questions on public forums, they ask their favorite chatbot. These questions, and their answers, aren't publicly visible for other people with similar struggles to find.

@karl @b0rk one of the particular benefits of Stack Overflow was not the initial answers but the commentary, refinement, and background references added to the question over time, including "back in 2014, this was the best answer. In 2019, <new tool> generally replaces this solution with.... Support for this started with $version".

(e.g. ES modules and general browser advances, packaging tools in other libraries, standards advancement, library evolution, etc)