PC speed gains erased by modern software

https://lemmy.world/post/963429


Interesting take on compatibility vs performance. I gotta imagine capturing user data and sending it to a cloud collector is also a big culprit.

Massive amounts of telemetry data, plus nearly every app these days just being a web app, chew through your hardware. We use Teams at work and it's just god-awful. Hell, even Steam is a problem: just having your friends list open can cause a hit to your FPS in some games.

Teams eats up my macOS memory something awful and it's just sitting there… no one saying anything.

wtf I hate it

When I log in to my work laptop in the morning, before even doing anything, Teams is the app using the most resources - typically 300-400 MB, while doing nothing.
There’s also the load from having fancy graphics, like transparency and fading window transitions.
'Twas always thus. Software development is gaseous: it expands to fill whatever space it is placed inside. That is partly the nature of software engineering, which takes the quickest route to solving any problem, and partly by design, through collusion between operating system manufacturers (read: Microsoft and Apple) and the hardware platform manufacturers they support and promote. This has been happening since the dawn of personal computer systems, where leapfrogging processors, RAM, hard drives, buses, and networks eventually leads to hitherto improbably extravagant specs bogged down to uselessness. It's the bane, and the very nature, of the computing ecosphere itself.
The problem is also that developers have to add more and more fancy features that are enabled by the new tech. Twenty years ago, a calculator app didn't need nice animations on its buttons, but these days that's expected.

Hasn't this always been the case? Software development is a balance between efficiency of code execution and efficiency of code creation. Twenty years ago people had to code directly in assembly to make games like RollerCoaster Tycoon, but today they can use C++ (or even more abstract systems like Unity).

We hit the point where hardware is fast enough for most users about 15 years ago, and ever since we’ve been using faster hardware to allow for lazier code creation (which is good, since it means we get more software per man-hour worked)

which is good, since it means we get more software per man-hour worked

In the same way that more slop is good for the hog trough

As long as hardware performance keeps increasing, developers will take advantage of it and keep sacrificing performance in exchange for better developer UX. Given a choice between their app using 10x the memory or taking 10x less time to develop, most devs will choose the latter, especially with a manager breathing down their neck. The only time a developer will choose to make efficient, but longer-to-develop, apps is usually when that developer has final say about the project, which usually means small personal side projects, or projects in a company led by technical people who refuse to compromise (which is rare).

Once hardware performance plateaus, we'll see a resurgence of focus on improving application performance.

The real sad part for me is the amount of e-waste this produces. Especially in devices like laptops.

A clean Linux distro can extend a laptop's life by a decade. I have a laptop from the Core 2 Duo era that I threw an SSD in and put Linux on. Perfectly serviceable as a basic machine.

I don't typically see computers getting replaced because they are "slow", but because of factors more related to support, like Apple dropping support for older models just because. Or companies doing routine system replacements: their fleet of computers is getting banged up and damaged, and since the machines are XYZ years past release and no longer in production, the cost and complexity of repairs, maintenance time, etc. are no longer worthwhile. Not to mention, the more insecure members of our species feel their self-worth depends on how fancy and new their things are.

If you want to point fingers at why the performance of operating systems and programs has declined over the years, I would say it's in large part due to security features.

I have an ancient early Windows 7-era AMD dual core (Bulldozer-based? So it's actually more like 1.5 cores) that runs just peachy on Lubuntu LTS, with 8 GB of mismatched used RAM I got for free and an ancient slow HDD.

I use it for D&D (Roll20 and Foundry) and MTG (Cockatrice).

I tested Windows 10 with a ReadyBoost SD card and 2 GB of RAM, and it barely worked.

So what I want to know is why do we still have programs that run on a single core when nearly every Windows PC out there is running a multi-core processor?

What are we missing to have the OS adapt any program to take advantage of the hardware?

To run something on multiple cores you need to identify a bunch of different tasks it is doing that don't depend on one another. Then you can execute each task in its own thread. The problem is that most often these independent tasks don't exist, or, if they do, figuring them out automatically from the code is likely equivalent to solving the halting problem; that is, it's undecidable, and there can't exist a program that does this in general.
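
To make that concrete, here is a minimal Go sketch (the tasks resizeImages and indexText are invented for illustration): a human has decided these two tasks are independent, so each can run in its own goroutine. No compiler or OS made that call for us.

```go
package main

import (
	"fmt"
	"sync"
)

// Two tasks a human has identified as independent: neither reads or
// writes data the other needs, so they can safely run in parallel.
func resizeImages() string { return "images resized" }
func indexText() string    { return "text indexed" }

func main() {
	var wg sync.WaitGroup
	results := make([]string, 2)

	wg.Add(2)
	go func() { defer wg.Done(); results[0] = resizeImages() }()
	go func() { defer wg.Done(); results[1] = indexText() }()
	wg.Wait() // join point: both tasks have finished past this line

	fmt.Println(results[0] + ", " + results[1])
}
```

The go keyword is the easy part; the hard part, which no tool does for you, is proving that the two tasks really don't share state.
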
Multi-threaded programming is hard. You can't just write some code and expect it to work across 4 cores; you need to know what to parallelise and how to do it. If you think normal bugs are hard to fix, just wait until you have a calculation that gives a different answer each time you run it, thanks to race conditions.
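
For anyone who hasn't been bitten yet, here is a deliberately broken Go sketch of that race condition: many goroutines increment a shared counter with no synchronization, so increments get lost and the total comes out different from run to run.

```go
package main

import (
	"fmt"
	"sync"
)

func main() {
	counter := 0 // shared state with no lock: this is the bug
	var wg sync.WaitGroup

	for g := 0; g < 100; g++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			for i := 0; i < 10000; i++ {
				// counter++ is a read-modify-write: two goroutines can
				// read the same old value, and one increment is lost.
				counter++
			}
		}()
	}
	wg.Wait()

	// Expected 1000000; usually prints something smaller, and a
	// different number on each run.
	fmt.Println("counter =", counter)
}
```

The fix is a sync.Mutex or the sync/atomic package, but note that the compiler happily accepts the racy version; only the runtime race detector (go run -race) complains.
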
Goddammit. I was watching this going "hey, my system is like that!" Checked, and yes: my 12-core (24-thread) Ryzen 5900X with 32 GB of RAM and an NVMe drive is painfully slow opening things like the calculator, the terminal, etc. I am running Fedora 38 with the KDE desktop… What the hell, man.
Single-thread performance matters more when your metric is the speed of opening the calculator, the terminal, and other apps. However, the 5900X already has pretty good single-thread performance, roughly 1.5x faster than my old processor (an i7-4790), and opening apps is pretty fast in my case (<1 second). Something is probably wrong with your setup. Perhaps you accidentally set your desktop power mode to "power saver" instead of "performance"?
Opening a calculator/terminal should be on the order of 1ms or even faster. 1s is absurd
Alas, I'm using GNOME, so a ~1 s calculator launch is the best I can get 😬
Yeah, there does seem to be something wrong now that I can see it. Now I just need to work out what it is…

For applications developed natively, response times would be expected to be quite good, but fewer applications are developed natively now, including things that might seem like they would be. Notepad, for example, is now based on UWP.

UWP creates native apps, though?

Stop buying games that need 220 GB of drive space, an Nvidia GTX 690000, and a 7263641677-core processor then. Anything more than a 60 GB download means I pirate it, unless it's a really, really damn good game.
My laptop was running slow until I blocked Windows 10 phoning home 3000 times per day. Looking at you, browser.pipe.aria.microsoft.com
This is a short write-up of a much longer blog post, so if you didn't click the link embedded in the article text, I recommend you read Julio's original blog post.
Fast machines, slow machines - Julio Merino (jmmv.dev)

Well, that was unexpected. I recorded a couple of crappy videos in 5 minutes, posted them in a Twitter thread, and went viral with 8.8K likes at this point. I really could not have predicted that, given that I've been posting what-I-believe-is interesting content for years and… nothing, almost-zero interest. Now that things have cooled down, it's time to stir the pot and elaborate on those thoughts a bit more rationally.

To summarize, the Twitter thread shows two videos: one of an old computer running Windows NT 3.51 and one of a new computer running Windows 11. In each video, I opened and closed a command prompt, File Explorer, Notepad, and Paint. You can clearly see how apps on the old computer open up instantly, whereas apps on the new computer show significant lag as they load. I questioned how computers are actually getting better when trivial things like this have regressed. And boom, the likes and reshares started coming in.

Obviously some people had issues with my claims, but there seems to be an overwhelming majority of people who agree we have a problem. To open up, I'll stand my ground: latency in modern computer interfaces, with modern OSes and modern applications, is terrible and getting worse. This applies to smartphones as well. At the same time, while UIs were much more responsive on computers of the past, those computers were also awful in many ways: new systems have changed our lives substantially. So, what gives?
