@sinister @buttplugio Only you can answer that, but I don't expect programming jobs to go away - a lot of trouble gets caused when nobody understands or cares about the code :)
Some open source projects are plagued by users submitting AI-generated patches and security reports that are invalid because the user didn't read, understand, or investigate what the AI spat out.
@fencepost @patrizia @sinister @buttplugio
But the 90% case is done in 10% of the time you spend on the code. IntelliSense and tab completion already do everything you need. AND you don't end up wasting hours hunting for a bug in a section of code when you don't even know what it's supposed to be doing.
@fencepost @patrizia @sinister @buttplugio
Debugging is 90% of the time you spend. The first raw program text is usually the fast part, unless you're working hard on optimisation problems - which is another weak point for AI.
AI is trained to show only the most common text for the given context - neither the most elegant, nor the fastest, most robust, or most correct program.
@patrizia @sinister @buttplugio when I was a school child more than fifty years ago, we were advised by our careers advisors not to go into software, because within five years (then) computers would be so clever they would program themselves.
Reader, I did not follow this advice.
@sinister @patrizia @buttplugio
Well as a hobby it probably becomes quite useful when you can just exploit 1980s vulnerabilities in the software of some "fancy startup"...
@sinister right now, yes. While the LLM is relatively good with code not connected to a bigger codebase, it's bad when there is one. And don't get me started on debugging. I think it's beneficial to be able to code-review the LLM-generated thing, also in future.
++ The time will come when the training data consists mostly of the LLMs' own generated output. So let's see how that ends.
@sinister @patrizia @buttplugio
If you want to be good at it and find it interesting, it's absolutely worthwhile. Hell, even pretending for a second that the AI shit was any good - which it's not:
"Other people are so good at it, should I even bother trying?"
Uh, yeah, that's how you become one of those people you're looking up at!
Is there a value in cooking when you can just go to the restaurant?
Or in sketching when you can just go to a museum?
Or in going outside when you can just look at a picture of it?
Yes. To all of these.
There's a value to skill. There's a value to learning skills. There's a value to finding the things you love and loving the things you find, and no technological "advance" or mass-produced sweatshop-product will ever change that.
@sinister Programming isn't just "making it work", part of it is also having to deal with user stupidity and protecting against that.
AI doesn't quite know how stupid users can get.
@enoch_exe_inc i mean cobol programmers are probably pretty desensitized with regards to bad code and awful design decisions.
It's another spat between people who use AI and people who don't.
It's dumb. Tools are tools, use what works, don't use what doesn't. Be careful about any dogma.
Short and sweet (after writing this, what a lie): 'AI' is a buzzword for a new kind of search. LLMs have their ups and downs just like traditional search types (exact match, semantic, regex) and Google search. The search can also be reversed to produce instead of retrieve.
Which is causing trouble. So are scams being put into advertisement on Google and Facebook and every business, so were email scams, and so were phone scams. New tech, new attacks. Same old story.
AI is a bit dangerous as a tool because, unlike most tools historically, it's not local, which means its nature can change. Though historically we have depended on external systems like pip for Python, or GitHub, or mdm etc, or even phone infrastructure. Most of those won't suddenly change unless you install the update. But you do see rumblings of a trust breakdown with things like GitHub, where people no longer trust that their private code is actually private. However, 'AI' (which is a terrible name for it) is very powerful and definitely has many use cases, many of which haven't been discovered yet.
Because 'AI' is built on data found everywhere and anywhere, and because in some cases it is polluting data sources (in some cases a huge problem), people are not happy; they feel, perhaps correctly, that they have been stolen from. Simultaneously, most of us now have a powerful new search tool. It's a difficult moral landscape. Is vibe coding actually super common? I don't know. I hear the term used more often as an attack on AI, so my guess is that in reality it's more a memetic dis on those who use AI.
Learning code will always be useful. To learn to code is to learn the structure of systems. Systems have some common truths about how they evolve, and attempting to build complex software teaches you about traps that systems run into that cause them to fail as they grow or age. Find a thing you want to do, and try to do it. Language models can help, and you can ask questions about the code itself. You won't always get the answer, but we still have other tools like traditional search, and books, and YouTube videos, etc. Use all of them.
Oh, and other people. People are still the most powerful source of information. Find smart people.
@arina For every unit test that passes, you get n seconds of some vibe pattern; n can be made proportional to coverage % in some way.
For each failure committed, you get something zingy.
Wire up VSCode or CI integration to vibrate for every passing test and stop for the next 60 minutes on failed runs. (And if the failure is in a section of the code that Copilot wrote, block for 120 minutes.)
That was supposed to be (just) a joke. But of course someone already actually implemented it.
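A minimal sketch of how the joke above could hang together as a pytest plugin - `pytest_runtest_logreport` is a real pytest hook, but `buzz()` is a made-up stand-in for whatever device API you'd actually use, and the coverage figure is hard-coded for the demo:

```python
# Hypothetical sketch: reward n seconds of "vibe pattern" per passing
# test, with n scaled by coverage %. buzz() is a placeholder, not a
# real device API.

def buzz(seconds: float) -> None:
    """Stand-in for sending a vibration command to a device."""
    print(f"bzzz for {seconds:.1f}s")

def reward_seconds(coverage_percent: float, base: float = 1.0) -> float:
    """Scale the reward duration by test coverage (0-100%)."""
    return base * (coverage_percent / 100.0)

def pytest_runtest_logreport(report):
    """pytest hook: called once per test phase; react to the call phase."""
    if report.when != "call":
        return
    if report.passed:
        buzz(reward_seconds(80.0))  # assume 80% coverage for the demo
    elif report.failed:
        pass  # the joke version blocks here for 60 min (120 if Copilot wrote it)
```

Drop it into a `conftest.py` and every passing test earns its buzz.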
@buttplugio @Janeishly You know, I go to DEFCON every year. And I like to check in on the talks which are not recorded or listed in any program. And every year, I pop in at a random time, only to discover it's a talk about sex toys. None of the others are about sex toys, but I always seem to find that one talk.
(I did like the one about BT Ben-Wa Balls being used to count cards at blackjack...Not that it would do me any good, being unequipped for the task.)
See what all the buzz is about at our vibe coding training camp. Sign up today.
/this should totally be a thing.
@phf @buttplugio
I remember that spoof.
The Register may have written about it.
This is what I could find, from 2005:
"our previous gusset-moistening reports into technoticklers"
https://www.theregister.com/2005/11/23/ipod_accessory/
Ooh, available in 6 colours:
https://www.theregister.com/2005/10/28/bluetooth_device/
One sentence from a 2000 article leads to a link:
https://www.theregister.com/2000/08/30/voodoo_gets_12_000_volt/
Actual link coming in the next toot...
@phf @buttplugio
I FOUND IT!
"Fully Y2K compatible!
You can safely ignore all of the horror stories of your vibrators suddenly shutting off on January 1st, 2000. With the iBrator you can usher in the new millennium with multiple, quivering orgasms"
"Now, with the power of USB you can connect up to 127 different iBrators for a sweaty, gang-banging marathon!
Plus, the iBrator draws power directly from your USB port - no more recharging!"
https://web.archive.org/web/20000302131226/http://www.ibrator.com:80/ibrator.html
@buttplugio I thought vibe coding was like when you make an LLM do it for you?
Terminology usage does not compute, brain.exe has stopped responding.
Congratulations on your expanded reach! Or should I say "flared user base"?