My experience with generative AI has been that, at its very best, it is subtly wrong in ways that only an expert in the relevant subject would recognise. So I don't worry about us creating super-intelligent AI, I worry about us allowing that expertise to atrophy through laziness and greed. I refuse to use LLMs not because I'm scared of how clever they are, but because I do not wish to become stupider.
I will say one thing for generative AI: since these tools function by remixing/translating existing information, the fact that vibe programming is so popular demonstrates a colossal failure on the part of our industry in not making this stuff easier. If a giant ball of statistics can mostly knock up a working app in minutes, this shows not that gen-AI is insanely clever, but that most of the work in making an app has always been stupid. We have gatekept programming behind vast walls of nonsense.
We seem to have largely stopped innovating on trying to lower barriers to programming in favour of creating endless new frameworks and libraries for a vanishingly small number of near-identical languages. It is the mid-2020s and people are wringing their hands over Rust as if it were some inexplicable new thing rather than a C derivative that incorporates decades-old type theory. You know what I consider to be genuinely ground-breaking programming tools? VisiCalc, HyperCard and Scratch.
You know what? HyperCard was a glorious moment in time that I dearly miss: an army of non-experts were bashing together and sharing weird and wonderful stacks that were part 'zine, part adventure game and part database. Instead of laughing at vibe-coders, maybe we should ask ourselves why the current state-of-the-art in beginner-friendly programming tools is a planet-boiling roulette wheel.
On the gripping hand, if you're a trained programmer using vibe-coding because of a perceived increase in your productivity, or pressure from management to increase your productivity, I would refer you to my first post in this thread…
I've seen lots of posts in the last couple of days about how quickly one can write lots of code with AI. I feel perplexed by this as I hate large programs. The largest thing I have written in the last decade is Flitter. It's only 30k lines and I believe very strongly that it is. Still. Too. Big. Even there, I wrote it purposely to allow the stuff I write *in* it to be very smol: mostly no more than 100 lines. That is the maximum I want to write in a day.
To me, all these people crowing about having written 10k lines of code in a day are idiots. If you need to write that much code in a day, you are manifestly working at the wrong level of abstraction to solve your problem.
@jonathanhogg
We need more done in actually high level languages.

@kirtai @jonathanhogg A large number of those lines of code are probably boilerplate indicative of abysmal library quality.

If a language's entire ecosystem is built on boilerplate and it's seen as normal, that is not okay.

Normalization of deviance

@lispi314 @jonathanhogg @kirtai I mean, we didn't get here unintentionally. It's expensive and self-sacrificial to invest a lot of time in infrastructure like programming languages and libraries (wherever you land in the stack) and to make it very simple to solve the problems you are solving (and for others to solve them too). It's also really hard, because the problem you are solving is no longer just an engineering problem but also building the infrastructure to solve that problem. And the limiting factor is exploration, not sitting down and hypothesising, researching, and cooking up some crap. Which, yes, is also part of the process too.
@lispi314 @jonathanhogg @kirtai you have to buy into an ideology of doing things this way. And if you're investing a lot of time in e.g. libraries for a given programming language, we already kind of know that our programming languages kinda suck and would be better made irrelevant, so it also feels inefficient even when you do buy into it.
@jonathanhogg @kirtai @lispi314 and as someone who's spent a tonne of time investing in a certain stack to make things easier for other people to solve the same problem... things can wind up esoteric and difficult to onboard with (like imagine APL). It would unironically be quicker for them to use boilerplate even if long term that may not be true.
@jonathanhogg yep. And if they're working on an operating system (or any related system software, or anything that needs to stay up and running), they're committing malpractice that's going to get a lot of people killed:
https://mastodon.social/@JamesWidman/116133223470110717

@JamesWidman @jonathanhogg

Skynet didn't destroy the world by getting too smart-- it actually just started glitching and chasing its own tail in gibbering circles and everything broke.

I mean, it invented time travel, so gibbering circles were pretty much inevitable. As I understood it, the question was not how to destroy the world or eliminate humanity, but how to do so in a way that fails due to time travel, but ends up with the next iteration of Skynet being just a little bit more effective. It was like... playing the villain to motivate the humans to improve it, as the only way to solve the problems it was presented with.

And I mean, if you destroy the world and kill (almost) all humans, then change history so it didn't happen, then it didn't happen! Right?

Like that one Wakfu villain, except it was actually working.

CC: @[email protected] @[email protected]
@jonathanhogg In general I agree but the current state of everything-in-React means a 1k line change often makes sense at some level. It's horribly verbose, logically small changes make big diffs, the library ecosystem is full of half-baked projects, it's just a mess. Removing some of the friction to adding code makes it much worse, but it's not the only cause. Now, back to better dev tools, unfortunately Excel has somehow ended up being the go-to for a lot of the world. I dunno.
@mirth It's truly amazing what browsers are able to do now, but unfortunately that doesn't fix that JavaScript and the entire JavaScript ecosystem is a godawful mess
@jonathanhogg Browser engines are amazing. What we have is related to why Flash was never going to work for mobile. Nothing to do with the runtime, everything to do with the target most devs built for (desktop which 15 years ago was much faster than phones). Now half of web devs have Apple M series silicon which is twice or more as fast as most of the installed base of PCs. Whatever is just tolerably fast on a dev's machine is guaranteed to suck everywhere else. Sadness ensues.

@jonathanhogg

Heap of 10K lines and they probably have NO idea what is even going on in there.

@jonathanhogg thank you for this thread!

Over the last few years, as the AI hype unfolded, I was lucky to get a closer view of Scratch, Snap and MIT App Inventor.

The ease of use, the speed of development and the abstraction of complex concepts into easy-to-use building blocks in all three were amazing.

Ever since AI took off, my brain couldn't stop thinking that if so much code gets generated, then we've been working at the wrong abstraction level all along.

@mainec I teach block-based languages to grownups all the time and I just wish they weren't viewed as "toys"
@jonathanhogg though watching the team behind them tinker with these tools at FOSDEM sure made the block-based tools look like as much fun as toys. Essentially bringing the joy and ease back to technology.
@jonathanhogg @mainec

Is there a blocks-based interface for working with Arduino /ESP specifically, that you are aware of? Event driven programming is so powerful yet so impenetrable for beginners.
@the_moth @mainec Microsoft MakeCode supports a range of hardware – can't remember if there are any ESP boards that are supported
@jonathanhogg You have also condemned yourself to spending the next four weeks frustratedly fixing the 10k lines of code. Idiot.
@jonathanhogg I write 10000000 lines a minute.
I run shell script commands.
It generates millions of lines of machinecode to execute my wishes.
Amazing!

@jonathanhogg
On one hand, I'm inclined to agree about the barrier to entry issue - boilerplate sucks, and having more people understand programming would be great.

But on the other hand, it feels like the amount of software in existence is already unmanageable, and the average quality is relatively low.

You say to move a layer up to avoid writing 10k lines, but the current way to do that results in huge dependency trees with 10s of thousands of lines of someone else's code.

1/

@jonathanhogg
All these dependencies have updates which introduce regressions and API breakage. And they also have vulnerabilities.

IME, these things can very quickly become unmanageable - you spend more time updating dependencies than writing your own code - unless you're very picky about your dependencies.

So is more people writing more software what society needs?

@wolf480pl it is the current way of moving up a layer that I object to. We should be thinking of new ways of programming and instead are stuck making new frameworks. We imagine adding more cruft will somehow make it better. E.g., Arduino and Processing imagined that you could take a language wildly unsuited to beginners and make it palatable with a library
@jonathanhogg @wolf480pl To me it looks like Arduino largely succeeded at that.

Note however that it was more than a library. It was also specialized tooling for that one use case that made it simple. It was also an ecosystem of user-provided libraries that made many tasks simple, typically driving a specific piece of hardware.

And a community of people who wrote about what they did, and how, in the greatest detail, to the point that anyone who could read could replicate it.

And yes, a company seeding and promoting this tooling and libraries at the start to promote their hardware.
@bunny but it's never been easy to learn if you're not a programmer, and believe me I've tried to teach it; C++ is just fundamentally unsuited to beginners
@jonathanhogg That's the thing. It's difficult to learn all of C++.

To follow a tutorial you would only need to understand a comment saying // I am using pin 3, replace with the pin number you want to use.

And yes, once you wanted to add your own logic you would need to learn some C++. But instead of taking a very formal approach with some textbook or course covering everything in C++, there is always the option of piecing something together by looking at multiple different examples.

Arduino uses a very small subset of C++, and for many use cases you need an even smaller subset to adapt an example you found to do something slightly different.

All the power of C++ is there but you do not need most of it most of the time.

And sure, many people will look at C++ with the braces and be like "This looks like something I don't understand, I can't do anything with that, look at all those braces, it reminds me of math". On the other hand, many will try to poke at it and see what happens, especially when they don't know what braces are anyway, and see them only as a funny squiggle.

I think the really annoying part for people trying to modify a C++ program without knowing C++ is that in most IDEs, including Arduino's, it's possible to write syntactically invalid code. And I have yet to see a solution for that problem that does not get in the way of writing code. Maybe it would be possible to do something like the switching between raw text and WYSIWYG that you get in a lot of web applications that use Markdown or similar.
@bunny best thing I've used for teaching embedded programming is Microsoft MakeCode, which is explicitly designed to avoid the pitfalls of syntax errors. It's basically Scratch for microcontrollers
@jonathanhogg these days my programs tend to be 100 lines of logic, and then 10x error handling and input parsing. But I feel like it's always been this way, even before exceptions.
String parsing is another bottomless pit of despair. I once dreamt of modifying Unix tools to all emit xml (before json). Which means ls and stat end up being the same core program. And sls (the printf version of ls) can get folded in too.
TBH, I find typing helpful, and prefer it (I mostly wrangle Python).

@jonathanhogg

Indeed. These people urgently need to absorb the lesson of DRY: Don't Repeat Yourself.

@jonathanhogg I feel like this misses the point. The point is to not have to spend your life writing 100 lines in a day, when the end goal can be achieved faster and still be as good if not better than hand-made. I am not saying there are no issues and that there is no slop. I am saying it requires a mind shift and learning in order for it to not produce slop. Otherwise it will produce results like in your first reply. If coding is just a hobby for you, then none of this matters anyways.

@jonathanhogg Consider this scenario: spend a very long time planning and designing, and then have a very fast code output, then fix any issues.

Also what about projects which can't be made in 30k lines? Doesn't automatically mean that the project is wrong just because it is big.

@warmsignull Unfortunately it seems that Fred Brooks' work is not common knowledge. He concludes that the number of bugs in a program is not linear in the length of the program but a *power function* of it.

So yes - brevity is a goal. And there have been studies that show that verbose languages produce more bugs. So it is in our best interest as systems engineers to research how to improve programming.

e.g. what is expressed in 30k of Java is not the same as 30k in Lisp.

(cc: @jonathanhogg)
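A hedged back-of-the-envelope reading of the Brooks point (the exponent below is illustrative, not a measured value): if defects grow as a power of program length, shrinking programs pays off superlinearly.

```latex
% Assume defect count B grows as a power of length L, with exponent \beta > 1:
B(L) = k\,L^{\beta}
% Then a program 10x longer carries more than 10x the defects:
\frac{B(10L)}{B(L)} = 10^{\beta} > 10
% e.g. with the illustrative \beta = 1.5, comparing 30k lines to 3k lines:
10^{1.5} \approx 31.6
```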

@jonathanhogg I've spent the last decade writing about 50k lines of C++ and it's barely comprehensible to me. Despite regular bouts of significant refactoring and deleting, as old 'new and essential' functions turn out to never be used or the one user goes away.

I did point a downloadable LLM assistant at it but got nothing usable. I was half hoping for a "sure, I'll rewrite that into Rust for you" result 🤣

@jonathanhogg

"planet-boiling roulette wheel" is the name of my upcoming experimental jazzcore EP

@pikesley @jonathanhogg looking forward to watching them at EMF later this year
@jonathanhogg HyperCard was great.
@StaceyCornelius @jonathanhogg is a descendant/replacement extant now?

@Photo55 @StaceyCornelius @jonathanhogg Livecode is sort of descended from a Hypercard clone (https://livecode.com). And there are a number of runtime engines for old-school Hypercard decks (https://archive.org/details/hypercardstacks?&sort=-downloads). There's also Decker, which is a spiritual inheritor (https://beyondloom.com/decker/).

Dang I miss Hypercard.


@jonathanhogg HyperCard was *amazing* and I don't understand why there's nothing like it anymore. It was like building programs with Lego. Just snap things together, write your program in a very natural language, and do incredible things. It was so easy to double click on something and add a few lines of code. I remember also having fun with the flexibility of the language and constantly trying to see what different syntax I could get away with.

@fozztexx @jonathanhogg there are quite a few, Lego even have a visual programming language for their smart bricks (I think Python is officially supported, unlike the not-quite-C used for Mindstorms).

There's also a visual language for a smart RC/drone controller built by that one guy :) iforce2d is the guy.

But they're generally hard to do anything nontrivial with and very hard to debug. Like Excel/Calc... so easy to have subtle errors even in simple programs that it's considered inevitable.

@jonathanhogg this is my central response to the "AI makes software development accessible" argument.

Once upon a time anyone could program their personal computer using a book that came with it. We taught it to all the kids in my tiny town's elementary school. My shopkeep neighbor and our local mechanic wrote their own custom software with no CS background.

BASIC, Hypercard, personal computers, printed manuals > LLMs.

@requiem @jonathanhogg We still have all that, but itโ€™s 10 different web sites you have to pay a monthly subscription fee to. Every small business that is just one person gets stuck in that web.
@requiem @jonathanhogg I am old enough to remember the days before gcc, when I wanted to learn how to code and everyone laughed and said first I'd need $500 for a copy of the Borland C compiler. It wasn't until linux became popular that I actually had a way to compile my C programs.
@skotchygut I'm pretty sure I still have a "backup copy" of Turbo C++ somewhere... 😇 @jonathanhogg
@requiem @skotchygut believe it or not, I actually learned C on the BBC Micro in the 80s with the Beebug C compiler, which was a strange contraption that compiled to a 16-bit virtual machine code and then interpreted that on the 8-bit 6502

@requiem @jonathanhogg don't forget COBOL and SQL, both invented so businesspeople could dispense with overpriced programmers and just talk to the machine directly.

Turns out that programming people is much easier than programming machines, or at least that yelling at the compiler about an error message is much less useful than yelling at subordinates.

@jonathanhogg
Help us get the federated wiki there.

It is more than a successor in spirit to HyperCard.

You would be surprised to learn about what #FedWiki does.

http://next.ward.dojo.fed.wiki/what-wiki-does.html


@jonathanhogg Lazarus still exists, as a faint remnant of those times

@jonathanhogg Scratch is excellent. My kid's been using it. I used hypercard at his age and it was a lot of fun.

Had it not been for our teacher getting two Macs into the class, and us being able to spend time on them before and after school, I don't think it would have been as fun. It's not just the tools, but also the environment and culture.

@rojun @jonathanhogg Playing with Scratch is definitely fun, even if you're an adult with programming experience already.
@jonathanhogg Have you seen Decker? Itโ€™s a lovely homage to HyperCard: https://beyondloom.com/decker/index.html
@eschaton It looks cute, though it's curious to build such a faithful homage yet ditch the most interesting thing about HyperCard – the HyperTalk language
@jonathanhogg I think the author would disagree that HyperTalk was the most interesting thing about HyperCard, especially since they put a lot of work into crafting a language they feel is comfortable for such a use. (At least they didn't just use JS or Lua…)
@eschaton oh yeah, I'm sure they had their reasons. Interesting "to me" I suppose 😉

@jonathanhogg HyperCard was amazing, and I like to think that if it had still been getting the appropriate support it needed, it would have transitioned really well to the World Wide Web, with stacks being transported to an in-browser runtime plugin that could make calls to data sources on servers.

Even on various modern SaaS platforms, we are still struggling to empower internal staff to build useful stuff that HyperCard could have done so well. The newest generation of that is LLM-powered agentic tools for departments, but even those are turning out to be difficult to build quality products from.

@jonathanhogg well, things started going downhill bc of Steve Jobs:

Atkinson: [At an event at the school our kids went to], he said "you know, Hypercard was really ahead of its time, wasn't it?"

And that was the first time he'd ever said anything good about hypercard. I think before that, the problem is: Hypercard is the reason I didn't go to NeXT, and it had Sculley's stink all over it. [...]
https://youtu.be/INdByDjhClU?si=Ft-fFBWBGC5tmC74&t=954

That doesn't explain why the world stayed at the bottom of the hill tho.

Triangulation 247: Bill Atkinson Part 2
