Twice now I’ve experienced the fallout of bugs in my coworkers’ code, and when I looked into it, the bug was introduced by Copilot.

Think about that for a second.

I’m trying to accept that everyone I talk to at work about these systems (I won’t dignify them by using the term “intelligence”) ignores my warnings and treats me like a fool for refusing to use them, but now I have to clean up the mess others make by trusting these things.

This isn’t sustainable.

@requiem Using LLMs to generate code like this is just going to take all the fun out of programming and leave developers having to troubleshoot even more code.

Troubleshooting someone else's code sucks, and it's even worse when you can't sit down with them and ask what their thought process was, because they aren't a sentient being.

@faoluin This is one of my primary concerns. I've seen other trades and skills die through patterns like this, and in some cases they were replaced by something superior, but in as many (or more) they were replaced by something that's worse in every way except making a few people rich.

Programmers who rely on this will become worse programmers, and people who use the work will suffer. Those programs will then get consumed into these models and the results will get even worse.

Garbage in, garbage out.