Twice now I’ve experienced the fallout of bugs in my coworkers’ code, and when I looked into it, the bug was introduced by Copilot.

Think about that for a second.

I’m trying to accept that everyone I talk to at work about these systems (I won’t dignify them by using the term ā€œintelligenceā€) ignores my warnings and treats me like a fool for refusing to use them. But now I also have to clean up the mess others make by trusting these things.

This isn’t sustainable.

@requiem It's vastly disappointing how many people (including here) misunderstand both the problems associated with AI and the capabilities of AI in and of itself.

* The current capabilities of AI are over-hyped and overestimated. It's fancy pattern recognition. It is by no means intelligent.

* Corporations are abusing it to steal code and art, and thus to get rid of jobs.

* AI output is error-prone and always worse than what a skilled human would produce, but bad quality has never stopped a corporation from cheaping out in order to profit.

It is a multiplier in the race to the bottom. Artists, writers, etc. are all getting massively screwed by having derivatives of their work stolen, while at the same time job offers for the simpler tasks vanish. As if creative people needed another kick while down. And their customers are being screwed by getting worse products in the end.

We're not "scared" of AI because we think it might go Skynet on us. It ain't that clever. It's problematic because it gives corporations another way to exploit us. On a massive scale.

And sorry, but "Should have reviewed the code" is a lame excuse. We all know it's harder and slower to properly and thoroughly review code than to write it from scratch, especially for the trivial stuff AI would be used for at this time.

By using AI you're feeding more data to the companies running them which they can assimilate into their models. By using the tools you are accelerating the problem and actively making the world worse.

There is just no reason and no excuse to use AI. Just don't.

@jns @requiem I'm curious whether you think there is any place for such systems? I agree that there are many problems in the theft of IP. And certainly the output of anything complex can be questionable at the moment. Though perhaps not more so than the average Google search brings up. But as an interactive teaching system that does a good job of recognising what you're saying/asking and producing some explained output, I think it has great potential. At least I've found value in it.

@makergeek @requiem I think that's asking the wrong question. One can always dream up use cases, but the question we should be asking is: does the benefit we get out of those use cases outweigh the potential (and real) harm?

For instance, I don't work on AI projects, not because I can't think of any use cases, but because it would be endorsing the current hype surrounding it. Companies who are currently using AI for profit rely on that hype in order to get more funding.

So the question is, ultimately: is what I'm doing benefiting or harming society? And when it comes to using or developing AI projects, any time I weigh that balance, it shifts heavily towards harm.

@jns very well said.

This is the same reason I stopped designing weapons systems in middle school, because I don’t want a world with more ways to kill people, even if there’s a lot of money in it and the engineering is interesting.

@makergeek

@requiem @jns I think the moral case in weapons systems is somewhat more clear-cut. But tools that can be used for good or ill are tricky. I personally don't fly because of the environmental impacts. But should flying be off limits for all usage? Lots of tools have been developed that did a great deal of harm and a great deal of good. Are we better off leaving the development of such tools to people who don't fret about the harm side of the equation?

@makergeek @requiem @jns I think yes, it’s better to let it flame out on its own, but I’m not sure.

ā€œThe greatest minds of my generation are working on spam filtersā€ - William Gibson

and then it was ad networks and then it was crypto, and now maybe it’s AI. I’ve had engineers quit to go join boiler room crypto ventures.

@makergeek @requiem @jns And it’s all still out there, but much of it has ā€œfailedā€ and faded away. The technological shift, however, is slow and remains despite the cash grabs. We now have blockchain config managers, and data science enjoys what was made for spam and clicks. I don’t pretend that the money didn’t influence the technological progression, but I don’t think I could’ve changed its direction with my involvement.

@reconbot I hope you’re right about it being another fad or whatever term we should use for these things.

@makergeek @jns