PC speed gains erased by modern software
Teams eats up my macOS memory something awful, and it’s just sitting there… no one is saying anything.
wtf, I hate it
Hasn’t this always been the case? Software development is a balance between efficiency of code execution and efficiency of code creation. Twenty-five years ago, people had to code directly in assembly to make games like RollerCoaster Tycoon, but today they can use C++ (or even more abstract systems like Unity).
We hit the point where hardware was fast enough for most users about 15 years ago, and ever since we’ve been using faster hardware to allow for lazier code creation (which is good, since it means we get more software per man-hour worked).
> which is good, since it means we get more software per man-hour worked
In the same way that more slop is good for the hog trough
As long as hardware performance keeps increasing, developers will take advantage of it and keep sacrificing performance in exchange for better developer UX. Given a choice between their app using 10x the memory and their app taking 10x less time to develop, most devs will choose the latter, especially with a manager breathing down their neck. The only time a developer chooses to build efficient but slower-to-develop apps is usually when that developer has final say over the project, which usually means small personal side projects, or projects at a company led by technical people who refuse to compromise (which is rare).
Once hardware performance plateaus, we’ll see a resurgence of focus on improving application performance.
The real sad part for me is the amount of e-waste this produces, especially in devices like laptops.
A clean Linux distro can extend a laptop’s life by a decade. I have a laptop from the Core 2 Duo era that I threw an SSD in and put Linux on. It’s perfectly serviceable as a basic machine.
I don’t typically see computers getting replaced because they are “slow” but because of support-related factors: Apple dropping support for older models just because, or companies doing routine fleet replacements because their computers are getting banged up and damaged, and since the machines are XYZ years past release and no longer in production, the cost and complexity of repair and maintenance are no longer worthwhile. Not to mention, the more insecure members of our species feel their self-worth depends on how fancy and new their things are.
If you want to point fingers at why operating system and application performance has declined over the years, I’d say it’s largely due to security mitigations.
I have an ancient early W7-era AMD dual core (Bulldozer-based? So it’s actually more like 1.5 cores) that runs just peachy on Lubuntu LTS with 8 GB of mismatched used RAM I got for free and an ancient, slow HDD.
I use it for D&D (Roll20 and Foundry) and MTG (Cockatrice).
I tested W10 on it with a ReadyBoost SD card and 2 GB of RAM, and it barely worked.
So what I want to know is: why do we still have programs that run on a single core when nearly every Windows PC out there has a multi-core processor?
What are we missing that would let the OS adapt any program to take advantage of the hardware?
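Part of the answer: the OS scheduler only distributes threads, and it can’t see the data dependencies inside a serial program, so it can’t slice a single-threaded loop across cores on its own. The split has to be expressed by the programmer. A minimal C++ sketch of that explicit opt-in (the summing workload is made up purely for illustration):

    #include <algorithm>
    #include <cstdio>
    #include <numeric>
    #include <thread>
    #include <vector>

    // Sum a big array on N threads. The OS schedules these threads across
    // cores only because the program explicitly created them; it cannot
    // infer this decomposition from an equivalent single-threaded loop.
    int main() {
        std::vector<int> data(10'000'000, 1);
        unsigned n = std::max(1u, std::thread::hardware_concurrency());

        std::vector<long long> partial(n, 0);
        std::vector<std::thread> workers;
        size_t chunk = data.size() / n;

        for (unsigned i = 0; i < n; ++i) {
            size_t lo = i * chunk;
            size_t hi = (i + 1 == n) ? data.size() : lo + chunk;
            // Each worker writes only its own slot, so no locking is needed.
            workers.emplace_back([&, i, lo, hi] {
                partial[i] = std::accumulate(data.begin() + lo,
                                             data.begin() + hi, 0LL);
            });
        }
        for (auto& t : workers) t.join();

        long long total = std::accumulate(partial.begin(), partial.end(), 0LL);
        std::printf("sum = %lld across %u threads\n", total, n);
        return 0;
    }

Write the same sum as one plain loop and the scheduler will happily leave the other cores idle; nothing in the OS can rewrite the program after the fact. (Auto-parallelizing compilers exist, but they only handle narrow, provably independent loops.)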
For applications developed natively, you’d expect response times to be quite good, but fewer applications are developed natively now, including things that might seem like they otherwise would be. Notepad, for example, is now based on UWP.
UWP creates native apps, though?
Well, that was unexpected. I recorded a couple of crappy videos in 5 minutes, posted them in a Twitter thread, and went viral, with 8.8K likes at this point. I really could not have predicted that, given that I’ve been posting what-I-believe-is interesting content for years and… nothing, almost-zero interest. Now that things have cooled down, it’s time to stir the pot and elaborate on those thoughts a bit more rationally.

To summarize, the Twitter thread shows two videos: one of an old computer running Windows NT 3.51 and one of a new computer running Windows 11. In each video, I opened and closed a command prompt, File Explorer, Notepad, and Paint. You can clearly see how apps on the old computer open up instantly, whereas apps on the new computer show significant lag as they load. I questioned how computers are actually getting better when trivial things like this have regressed. And boom, the likes and reshares started coming in. Obviously some people had issues with my claims, but there seems to be an overwhelming majority of people who agree we have a problem.

To open up, I’ll stand my ground: latency in modern computer interfaces, with modern OSes and modern applications, is terrible and getting worse. This applies to smartphones as well. At the same time, while UIs were much more responsive on computers of the past, those computers were also awful in many ways: new systems have changed our lives substantially. So, what gives?
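As an aside, for those asking how to quantify this beyond eyeballing videos: a rough approach on Windows (not how I produced the clips, just a sketch) is to time from process creation until the app reaches an idle message loop. Here, notepad.exe is a stand-in target, and WaitForInputIdle only approximates “visibly ready”:

    #include <windows.h>
    #include <chrono>
    #include <cstdio>

    int main() {
        STARTUPINFOA si = { sizeof(si) };
        PROCESS_INFORMATION pi = {};
        char cmd[] = "notepad.exe";  // app under test; any GUI exe works

        auto start = std::chrono::steady_clock::now();
        if (!CreateProcessA(nullptr, cmd, nullptr, nullptr, FALSE, 0,
                            nullptr, nullptr, &si, &pi)) {
            std::fprintf(stderr, "CreateProcess failed: %lu\n", GetLastError());
            return 1;
        }
        // Blocks until the process waits for input with no pending messages.
        WaitForInputIdle(pi.hProcess, 10000 /* ms timeout */);
        auto elapsed = std::chrono::steady_clock::now() - start;

        std::printf("time to input-idle: %lld ms\n",
                    static_cast<long long>(
                        std::chrono::duration_cast<std::chrono::milliseconds>(
                            elapsed).count()));

        TerminateProcess(pi.hProcess, 0);  // clean up the spawned app
        CloseHandle(pi.hThread);
        CloseHandle(pi.hProcess);
        return 0;
    }

It’s crude (first launch vs. warm cache matters a lot), but it makes the regression measurable instead of anecdotal.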