When I was young, in my computer science degree, I learned, and was taught, how to make computers work efficiently and correctly.

Now it is the opposite: do brute-force search using giant farms of computers, consuming huge amounts of energy and water, and get results that are no longer guaranteed to be correct.

And I was discussing with a colleague this morning how my 2001 laptop ran faster than my current top-of-the-range computer for everyday tasks. Of course, it had a much worse CPU and much less RAM. And of course the software for the things we still do *now* was much faster *then*.

I still have that laptop from that time running Ubuntu 4.10 from 2004 in my personal museum of computers. You would be amazed how responsive the system is for everything we do every day with a computer. I recently tested it with my son, because he was curious to see how things were then.

So we are using more powerful hardware for getting a poorer experience.

The new computers are much better for some things, such as running Agda. But for everything else I happen to do, the old machines were just as fast, because people programmed them more efficiently (they had to; there was no other way).

@MartinEscardo

I remember when you could still install the old NCSA Mosaic binary on some newer hardware, which I think isn't possible anymore with chip changes. Anyway, a coworker challenged me to try it and I was like sure.

I cannot begin to describe how fast it was. Eyeblink fast. Screamingly fast. Comically fast -- I literally guffawed when it loaded one page. Like the whole webpage was a JPG it just loaded from disk. And of course a bunch of stuff didn't work, but most of it was crap I didn't want anyway.

@MichaelTBacon @MartinEscardo You know what drives me nuts? Volume control.

Once upon a time, you could change volume INSTANTLY. It was just a cheap potentiometer. If something was TOO LOUD, you could turn the pot and THAT PROBLEM was solved just like that.

But on a modern laptop? Or "smart" phone? Or almost anything? Yeah, good luck turning down the volume in a humanly reasonable amount of time.

The CPU can do millions of things per second, but not one of them is turning down the volume.

@isaackuo @MichaelTBacon @MartinEscardo As corporate tech enshittifies, I hope the smallscale hacker tech that is increasingly and delightfully able to take its functional place really embraces bringing back physical buttons and knobs, and I have some genuine optimism it will.
@isaackuo @MichaelTBacon @MartinEscardo My computer's audio is plugged into an old ghetto blaster.

@ernestoDuracelli @isaackuo @MichaelTBacon @MartinEscardo I have a couple of maybe prosumer grade audio things connected to my desktop computer. They have proper knobs, including volume knobs!

Nowadays I rarely do anything in my free time that requires talking to other people on the internet, but maybe I could set up my instrument microphone for that, too… And have the physical level knob.

@pare @isaackuo @MichaelTBacon @MartinEscardo A friend of mine used a Singstar microphone on TeamSpeak back in the day, which meant he was often playing with only one hand. These things don't have volume control, though, I think.
@isaackuo @MichaelTBacon @MartinEscardo It’s fast to change volume on iPhones (quick reaction to holding the volume-down button, and rapid change of volume).
On MacBooks there is a mute button, but I guess other non-Apple laptops have one too?

@isaackuo @MichaelTBacon @MartinEscardo

What drives me nuts about the volume landscape is that there are now like 11 volume steps to choose from. That cheap potentiometer let you choose from a practically infinite range of volumes that suited you.

@henrik @isaackuo @MichaelTBacon @MartinEscardo
Likewise!

We have these powerful pocket computers that communicate wirelessly across the globe, but what you can't do is set the volume just so. At louder volumes it's usually less of an issue, but in the range just above the threshold of hearing it's maddening.

And, the overall clumsiness.
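The maddening bit near the bottom of the range has a simple explanation: hearing is roughly logarithmic, while discrete volume steps are often spaced linearly in amplitude, so the same step is perceptually huge when quiet and tiny when loud. A minimal sketch (the 16-step count is an illustrative assumption, not any particular OS's behavior):

```python
import math

STEPS = 16  # illustrative number of discrete volume steps

def step_db(k, steps=STEPS):
    """Loudness of linear amplitude step k relative to full volume, in decibels."""
    return 20 * math.log10(k / steps)

# The jump between the two quietest steps is large...
low_jump = step_db(2) - step_db(1)     # ~6.0 dB
# ...while the jump between the two loudest steps is barely audible.
high_jump = step_db(16) - step_db(15)  # ~0.56 dB
print(f"quiet end: {low_jump:.2f} dB per step, loud end: {high_jump:.2f} dB per step")
```

That 6 dB jump at the quiet end is exactly the "too loud or inaudible, nothing in between" experience; a potentiometer (or logarithmically spaced steps) avoids it.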

@MichaelTBacon @MartinEscardo

You should try the @dillo browser sometime. It's a nice refuge from... all this nonsense. 😄

@scops @MichaelTBacon @MartinEscardo @dillo

That's practically a modern browser 😄

It even has (a teensy bit of) Javascript support!

(Kidding aside, NetSurf is pretty great. I just wish it had more keyboardy features. Dillo has fairly customizable keybinds, so you can make it almost as keyboardy as luakit/Qutebrowser/ff+vimium, but not quite)

@MartinEscardo I'm a fan of deterministic computing myself.

@MartinEscardo

I feel bad for saying it, but I often tell my friends, "Computer Science has reached its rabid-orangutans-smearing-feces-on-the-wall level of devolution."

I truly cannot comprehend the level of brainrot and insanity that is SOP these days. Even before all of this "A.I." giga-grifter nonsense.

I'm sad for my friends still in the field. And sadder that it has become such a caricature of itself.

@rl_dane @MartinEscardo the whole "curl [url-quoted-by-rando-foreign-remote-anon-strangers-incented-to-betray-you-and-filter-for-gullibility] | sudo bash" trend just blows my frickin mind.
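For contrast, a hedged sketch of the less trusting version: fetch the script to a file, read it, check it against a published checksum, and only then run it. A local file stands in for the download here so the sketch is self-contained; in real use the first step would be `curl -fsSL -o install.sh <url>`.

```shell
# A local file stands in for the downloaded script, so this sketch runs anywhere.
printf '#!/bin/sh\necho "installing..."\n' > install.sh

# Step 1: actually read what you are about to hand to a root shell.
head install.sh

# Step 2: verify against the checksum the project publishes.
# (Computed locally here only to demonstrate the verification step;
# normally you compare against a value published out-of-band.)
expected=$(sha256sum install.sh | awk '{print $1}')
echo "$expected  install.sh" | sha256sum -c -

# Step 3: only then execute, and with sudo only if genuinely required.
sh install.sh
```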

@synlogic4242 @MartinEscardo

And the official I.T. equivalent,
"Just install the container. I'm sure someone is keeping it up-to-date and secure. It'll be fiiiiiine"

@rl_dane @MartinEscardo and then... Enter The Vibe Coding. "It's so awesome, I can ask Claude to whip up my own personal ideal Time Tracking app for me!" Consequence: the biz now has 1000 completely distinct Time Tracking app codebases in use, each with wildly different arch and third-party lib/service dependencies, and... *nightmare*
@rl_dane @MartinEscardo yeah, I have a love/hate relationship with Docker and their image ecosystem because of it. Images and containers are wise ideas and net wins in some cases, when "just so", and they are horribly unwise in others. Too many folks don't seem to see the nuance of those distinction boundaries. They can bite.

@synlogic4242 @MartinEscardo

"Nuance?!? What's that! Maximum convenience go BRRRR!!"

@rl_dane @MartinEscardo lately I feel like I'm witnessing millions of people who, all at once, and who are demonstrably lazy or ignorant or reckless, or all three, have decided it would be incredibly wise to start juggling chainsaws

"This... will not end well."

@MartinEscardo Even back then there were horrible examples. See Windows 2000, which could slow down any fast machine of the time. Although Microsoft had a great idea: label it "Professional". Every "serious" user had to put up with a deadly slow machine, but running a professional OS! 😅
@luc0x61 @MartinEscardo It was my experience that while Win 2000 was slow to boot, it was actually really nice and fast once it was up and running if you had enough RAM. If you didn’t, though… woof. Also, unlike the 9x/ME series, it didn’t take the entire computer down in a spectacular crash regularly 😅
@trezzer @MartinEscardo I think that with Win 9x it was also a matter of applications and services. I managed to work with a few desktops/laptops on Win 98SE for years without any great issue, taking care to maintain a clean system. Then I remember a McAfee antivirus that thrashed the whole system and required a reinstall.
Also, later on I had a very reliable Win XP laptop where a Siemens PLC suite forced me to reinstall everything.
Even now I manage to keep W11 PCs very clean and reliable.
@MartinEscardo I'm writing a book on HPC, but I do worry that the market of people who still care might be shrinking. I'm gambling there will always be a slice, and niches, where it will always matter.
@MartinEscardo Way back in the day, when undergraduate projects were in Fortran IV, the big issue in computer science was how to make calculations more efficient. But they gave up on that and threw memory and power at it. Bloatware running on bloatmachines.
@allypally @MartinEscardo I remember the big improvement was a hot card reader at the service desk
@MartinEscardo I do feel like the entire field of Computer Science has completely abandoned the principle of efficiency.
@unfmeghan @MartinEscardo is that something computer scientists are doing? I thought it was the software businessists making everything worse, while computer scientists plug away at theories and rarely make software that’s more than a proof-of-concept

@MartinEscardo

Yeah, I remember how long I put off updating Microsoft Word, because the new version was 30 MB and that just seemed so ridiculously bloaty.

It was the size a hard drive used to be, a few years previous!

Goodness knows what size it is now. I haven't used it in about 25 years

@NilaJones @MartinEscardo Remember when you opened your 25k MSWord files and they suddenly expanded to 250k? That was the beginning of the end.
@ELS @NilaJones @MartinEscardo meanwhile, the humble plaintext file sitting off in the corner: "a kilobyte is... a bit excessive, no?"
@NilaJones @MartinEscardo Nothing, it's all in the cloud. Sure, you get five seconds of input lag, but no footprint in your memory. /s
@MartinEscardo outside of the slop thing, that's quite the rose-tinted-glasses take
@toroidalcore @MartinEscardo This is soooo true! I've kept saying that this vibe coding nonsense can be short-circuited if you just employ one of the people whose code the LLM is ripping off!
Generative AI doesn’t copy art, it ‘clones’ the artisans — cheaply (R&A IT Strategy & Architecture)
@fionasboots @toroidalcore @MartinEscardo It'll be interesting to see where this goes now that they're starting to ramp up the cost of using it. Using Claude with GitHub will increase in cost by a factor of 18 in June (assuming their new token pool is equivalent to today).
@veronica 18 times more expensive! Wow, that is going to hurt!
@toroidalcore @MartinEscardo
Nvidia exec says AI is more expensive than actual workers — yet some companies don't see the extra costs as a negative (Tom's Hardware)

@TheMNWolf like I've been saying, tech companies will pay anything to avoid having employees

in other news, hire me

@WizardOfDocs such as using a technology to replace them that is so incompetent that it requires employees on hand to fix what it produces.

If I were hiring, I probably would. Skepticism is one of the hallmarks of a good troubleshooter.

Parkinson (Cyril Northcote) noted that work expands to fill the time available for its completion.

Similarly with software vs. hardware resources.

@MartinEscardo

@MartinEscardo @Gargron retro:// future:// retro:// future:// retro:// future:// retro:// future:// retro:// future:// retro:// future:// retro:// future:// retro:// future:// retro:// future:// retro:// future:// retro:// future:// retro:// future:// retro:// future:// retro:// future:// retro:// future:// ad ∞

@MartinEscardo

And when you buy some new tech nowadays, it seems that you don't own it; rather, you are subordinate to its commands.

(Just returned a new phone after several hours of fighting with the beast.)

I once booted MS-DOS on my 2011-ish crappy laptop that I use specifically for testing things. I was amazed by how instant everything is. There's no perceptible input lag at all. Things like dir take zero time as far as I'm concerned. Games launch instantly as well.

This made me wonder how fast Windows 9x would be on modern hardware.

@grishka @MartinEscardo
Yes, text is always faster than GUI: 1 byte is 1 character, but in graphics 1 byte is only 1 pixel. So everyone wanting pretty pics on their web pages was the first step in the big slow-down (in the early days I mostly used Lynx, unless I needed pics for what I was doing), and then along came video on the web...

💡Smartman, a 2011 CPU is fast enough that this difference shouldn't matter. I'm sure its memory bus bandwidth is enough to transfer an entire VGA framebuffer every frame, many times over.

GPUs are really good at rendering images and videos very fast, and all modern web browsers use hardware-accelerated rendering as much as possible. So it's not that either. I'd blame modern OSes and their schedulers.
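A back-of-envelope check of that framebuffer claim, taking nominal single-channel DDR3-1333 peak bandwidth (~10.6 GB/s) as an assumed stand-in for a 2011 laptop, and 1 byte per pixel as in the post above:

```python
# VGA-resolution framebuffer at 1 byte per pixel, refreshed 60 times a second.
VGA_W, VGA_H = 640, 480
BYTES_PER_PIXEL = 1

frame = VGA_W * VGA_H * BYTES_PER_PIXEL  # 307_200 bytes per frame (~300 KiB)
per_second = frame * 60                  # ~17.6 MiB/s at 60 fps

ddr3_1333 = 10_600_000_000               # ~10.6 GB/s nominal single-channel peak
headroom = ddr3_1333 / per_second        # hundreds of full refreshes of spare bandwidth

print(f"{frame / 1024:.0f} KiB per frame, {per_second / 2**20:.1f} MiB/s at 60 fps")
print(f"memory bus headroom: roughly {headroom:.0f}x")
```

So even a modest 2011 memory bus could repaint such a screen several hundred times per frame; raw pixel traffic is not the bottleneck.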

@MartinEscardo I completely agree, and I still strive to write lightweight, fast and responsive code, and take pride in that.

For anyone debating me on this, I point them to the demoscene, namely "elevated", ".the.product", and "debris".

@MartinEscardo There is a quote attributed to Henry Petroski that I think about a lot:

"The most amazing achievement of the computer software industry is its continuing cancellation of the steady and staggering gains made by the computer hardware industry."

@antsu @MartinEscardo The hardware guy giveth and the software guy taketh away.
@MartinEscardo This is something that bothers me in general, because it means perfectly fine hardware gets left behind simply because modern software, or even just websites, demand too much, despite not providing much more to justify the extra required power.

It especially bothers me with phones. What do you mean the app launcher is running slow on this old phone? Do the background services really need to use up all the resources when they do almost the same stuff as back when the phone was new?

@MartinEscardo @Gargron Also today I noticed yet again that most webpages seem to be optimised for phone screens, meaning that UI elements are comically big when visited with a desktop browser.

I find this particularly annoying, as there was a time when webpages used frameworks that would display them correctly and efficiently based on the browser and device you used to visit them. Desktop, phone, tablet: all taken care of, with elements displayed appropriately. Even UI frameworks are lazy now.