i do not want to get into the business of posting LLM takes but very briefly:

It feels clear to me that some people* are getting value out of using LLMs for programming. Basically see https://simonwillison.net/'s whole blog. If I think about it purely on the basis of "in a vacuum, can this help me write programs", it seems like an exciting technology.

BUT...

(1/?)

(* it also feels clear that some people are NOT getting value out of LLMs, hoping to avoid flamewars about that please)

(continued from ^)

Google search doesn't work as well anymore because the results are full of LLM-generated articles? I hear about CEOs putting pressure on their teams to produce more, faster, because they've been told that AI will increase productivity?

it feels sad. even though I find LLMs useful sometimes, with all of the societal impacts it often feels like this technology isn't actually improving my life.

(2/?)

@b0rk one thought that came to me, around when Cursor started seeming good enough that a lot of engineers were using it, was that LLM-assisted coding makes some sense since it puts the burden of using the tech on coders. AI-based summaries, on the other hand, shove it in the face of the general public, who don't have as much of a handle on its limitations. They also seem to be just the latest step in the general enshittification of search engines. And social media is especially vulnerable to AI bots.