The threat is comfortable drift toward not understanding what you're doing

https://ergosphere.blog/posts/the-machines-are-fine/

The machines are fine. I'm worried about us.

On AI agents, grunt work, and the part of science that isn't replaceable.

The thing is, agents aren’t going away. So if Bob can do things with agents, he can do things.

I mourn the loss of working on intellectually stimulating programming problems, but that’s a part of my job that’s fading. I need to decide whether the remaining work (understanding requirements, managing teams, what have you) is still enjoyable enough to continue.

To be honest, I’m looking at leaving software, because the job has turned into a different sort of thing from the one I signed up for.

So I think this article is partly right: Bob is not learning the skills we used to require. But I think the market is going to stop valuing those skills, so it’s not really a _problem_, except as Bob’s own intellectual loss.

I don’t like it, but I’m trying to face up to it.

> The thing is, agents aren’t going away. So if Bob can do things with agents, he can do things.

"Being able to deliver using AI" wasn't the point of the article. If it were the point, your comment would make sense.

The point of the program referred to in the article is not to deliver results, but to deliver an Alice. Delivering a Bob is a failure of the program.

Whether you think a Bob plus AI delivers the same results is not relevant to the point of the article, because the goal is not the results; it's the Alice.

I am aware of that; I was adding something along these lines: I don't think people care whether we deliver Alices any more.

People never cared about delivering Alices; they were an implementation detail. I think the article argues that they're still an important one, but one that isn't produced automatically anymore.

The article is talking about science research in the context of astrophysics, not coding sweatshops.

I was also talking about producing researchers for academia.