@aparrish not disagreeing with you, but the important point about the 'open' models is that unlike online services, the open/local models provide a minimum baseline for capability.
Apart from all of the other terrible things, there's an absolutely horrendous risk involved in getting locked in to an AI service behind an API which can be arbitrarily changed or removed.
@aparrish @flippac @whitequark (the environmental damage and the theft are externalities that do still bother us, personally, we think that goes beyond personal choice. we're just leaving that aside for the sake of focusing on something else right this moment)
that makes sense, yeah
They probably meant that you don't hurt anyone's profits if something goes wrong
@aparrish Yeah, I think there's a lot of danger in trusting these tools... and probably more so the more you trust them, and the less you understand the piles of slop they're cranking out or are capable of debugging or fixing them if need be...
I guess if one's baseline for software is "it can break or go away at any time, taking all your data with it unrecoverably" (because it's a mysterious black box), maybe that's okay... but I sure don't think it is.
@alice @aparrish I write documentation and knowledge base articles using HTML, which is ultimately what it was originally built for: creating documents at CERN, simple enough for physics researchers, who were not primarily IT people, to write. It's still perfectly well suited to this task. I'm pretty sure I could teach my 9yo to write a basic article using HTML, and back in the day lots of people of varying skill level made MySpace and GeoCities pages with just HTML.
I happen to think that Markdown-style is even easier for knowledge management, and I've never written Gopher to compare against, but Markdown doesn't have the features to make webapps, and HTML/CSS/JS has pretty much completely replaced native platform toolkits for GUI app development.
I develop some basic webapps too, not React-style SPAs because as you say the goalposts are on Mars, but basic forms are still doable, especially with all the things modern HTML5 input elements provide that don't require any JS to use, like the pattern attribute for client-side validation. Most things which are actually useful for a computer to do don't require the latest technology; personal computers were essentially feature complete by the end of the 2000s. They just need someone who cares to learn, and to not get hung up trying to compete with million-dollar commercial development teams. CGI still exists and is viable if you don't need to care about monopolistic scalability.
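As a minimal sketch of the no-JS validation mentioned above (the field name, regex, and form action here are purely illustrative):

```html
<!-- The browser refuses to submit the form if the input doesn't match
     the regex in `pattern` — no JavaScript required. The `title` text
     is shown in the browser's built-in validation message. -->
<form action="/save" method="post">
  <label for="slug">Article slug:</label>
  <input id="slug" name="slug" required
         pattern="[a-z0-9-]+"
         title="Lowercase letters, digits, and hyphens only">
  <button type="submit">Save</button>
</form>
```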
@aparrish around the time web 2.0 was getting into full swing, I read a thinkpiece by some Very Smart Web Guy who said "HTML is a terrible language for making web pages" and I'm still mad about it
I'm also still mad at David Siegel for unleashing his "creating killer web sites" book upon the world, because I feel like it was almost single-handedly responsible for the mindset that a web site is supposed to be a beautiful shiny brochure above all

@aparrish I also think this attitude (of the web being way too complex) is weird given the various improvements to HTML, CSS, Javascript, various web APIs, the "evergreen" browsers being much more homogeneous, etc.
I firmly believe that the solution to something becoming hard to understand because of giant piles of middleware and libraries and frameworks is not *more* middleware and autogenerated piles of slop on top of it... but rather the opposite.
Encourage junior software engineers to learn how to vibe instead of how to think, make their employers dependent on your LLM, and when things go wrong, they can consult with your small staff of SW engineers who learned how to think — for a premium…
@Red_Shirt_no2 @texttheater @aparrish well yeah. nobody designs car engines for fun today
and please note that there was a time when people did
@aparrish That was very much my reaction too. The difficulty of building a 90s-era website is very much the same now as it was back then - it's still basically the same HTML and CSS under the hood (except, perhaps, for the dreaded marquee).
In some ways it's easier because there's a bunch of free hosting services and tutorials; in some ways it's harder because it's hard to wade through the slop to find good tutorials (although notably, this is very much the fault of AI).
What has changed is the expectations. If you want your website to look like a "modern" commercial website, then yeah, doing that with simple HTML is hard, and agentic AI incidentally appears to be good at this style (to the point where I suspect a poorly-functioning site in that style is slop). But if you think that's the only way to go about things, then, well... let's just say Neocities wants to have a word with you
@brib @aparrish i think you're basically right about "expectations", but imo it's not (just?) about the look
the "large company commercial website" tooling _really_ did make *some* things easier (in exchange for usually making something else harder)
i remember how much _excitement_ there was (whether artificially amplified or not) as language-specific package managers (e.g. npm, although there was bower for a bit) and javascript bundlers (most prominently browserify back then, probably webpack nowadays) first took off, and made *that specific style* of code reuse way easier. people were doing things like "i can import an entire 3d graphics framework (three.js) and physics library (ammo.js?) and make something with it"
documentation certainly hasn't kept up, and i'm still generally looking for ways out of the current mess