I'm writing this from a crappy laptop with 2GB of RAM and a dull screen.

https://lemmy.world/post/16166119


But … where is the innovation (and also Alt text?)

Image description.

The image is a screenshot of a tumblr post by user elbiotipo.

My solution for bloatware is this: by law you should hire in every programming team someone who is Like, A Guy who has a crappy laptop with 4GB and an integrated graphics card, no scratch that, 2 GB of RAM, and a rural internet connection. And every time someone in your team proposes to add shit like NPCs with visible pores or ray tracing or all the bloatware that Windows, Adobe, etc. are doing now, they have to come back and try your project on the Guy’s laptop and answer to him. He is allowed to insult you and humiliate you if it doesn’t work on his laptop, and you should by law apologize and optimize it for him. If you try to put any kind of DRM or permanent internet connection, he is legally allowed to shoot you.

With about 5 or 10 years of that, we will fix the world.

Probably an innovative revelation of the concept of “bloat”.
Innovation is orthogonal to code size. There’s nothing most modern computers are doing that couldn’t be done on 10-year-old computers. It’s just a question of whether the team creating your software is plugging together gigantic pieces of bloatware or whether they actually develop a solution to a real problem.
Planned obsolescence is one of the major engines that keep our current system of oligarchic hypercapitalism alive. Won’t anybody think of the poor oligarchs?!?

Resources are just way cheaper than developers.

It’s a lot cheaper to have double the ram than it is to pay for someone to optimize your code.

And if you’re working with code that requires that level of resource optimization, you’ll invariably end up with low-level code and libraries that are hard to maintain.

… But fuck the Always on internet connection and DRM for sure.

If you consider only the RAM on the developers’ PCs, maybe. If you count the thousands of customer PCs, then optimizing the code outperforms hardware upgrades pretty fast. If millions of people have to buy new hardware because of a new Windows feature, that’s pretty disastrous from a sustainability point of view.
But that’s just more business!

Last time I checked - your personal computer wasn’t a company cost.

Until it is, nothing changes - and to be totally frank, the last thing I want is to be on a corporate machine at home.

When I was last looking for a fully remote job, a lot of companies gave you a “technology allowance” every few years where they give you money to buy a computer/laptop. You could buy whatever you wanted but you had that fixed allowance. The computer belonged to you and you connected to their virtual desktops for work.

Honestly, I see more companies going in this direction. My work laptop has an i7 and 16GB of RAM. All I do is use Chrome.

It’d be nice to have that - yeah. My company issued me a laptop that only had 16GB of RAM to try and build Android projects.

Idk if you know Gradle builds, but a multi-module project regularly consumes 20+GB of RAM during a build. Despite the cost difference being paid for in productivity gains within a month, it took 6 months and a lot of fighting to get a 32GB laptop.

My builds immediately went from 8-15 minutes down to 1-4.
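(For anyone hitting the same wall: before, or while, begging for hardware, the usual knobs are the heap sizes Gradle and the JVMs it forks are allowed to grab - the daemon’s own heap lives in gradle.properties via org.gradle.jvmargs, plus kotlin.daemon.jvmargs for the Kotlin compile daemon. A rough sketch of the build-script side in the Kotlin DSL; the numbers and task types are illustrative, not from this thread.)

```kotlin
// build.gradle.kts (module level) — illustrative values, tune to your project.

tasks.withType<JavaCompile>().configureEach {
    // Fork compilation into its own JVM with a fixed heap cap, so one
    // module's compile step can't balloon the long-lived Gradle daemon.
    options.isFork = true
    options.forkOptions.memoryMaximumSize = "2g"
}

tasks.withType<Test>().configureEach {
    // Give test JVMs a fixed budget instead of letting them inherit
    // whatever the host machine happens to have.
    maxHeapSize = "1g"
}
```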

I always felt that this is where cloud computing should be. If you’re not building all the time, then 32GB is overkill.

I know most editing and rendering of TV shows happen on someone’s computer and not in the cloud but wouldn’t it be more efficient to push the work to the cloud where you can create instances with a ton of RAM?

I have to believe this is a thing. If it isn’t, someone should take my idea and then give me a slice.

It’s how big orgs like Google do it, sure. Working there I had 192GB of RAM on my cloudtop.

That’s not exactly reducing the total spend on dev ram though - quite the opposite. It’s getting more ram than you can fit in a device available to the devs.

But you can’t have it both ways: you can’t bitch and moan about “always on internet connections” and simultaneously push for an always on internet connected IDE to do your builds.

I want to be able to work offline whenever I need to. That’s not possible if my resource starved terminal requires an Internet connection to run.

Ram is dirt cheap and only getting cheaper.

“Use cloud if available”?
Alternatively they could just use Windows VDI and give you a card + card reader that allows Remote Desktop Connection to avoid this hardware cost, like what my company is doing. Sigh
If the job is fully remote, then the workers could be living on the other side of the country. Using remote desktop with 100ms of latency is not fun.

Or maybe you could actually read the comment you are replying to instead of being so confrontational? They are literally making the same point you are making, except somehow you sound dismissive, like we just need to take it.

In case you missed it, they were literally saying that because the real cost of running software (like the AI recall bullshit) is externalized to consumers, companies don’t give a shit about fixing this. Like, literally the same thing you are saying. And this means that we all, as a society, are just wasting a fuck ton of resources. But capitalism is so efficient hahaha.

But come on man, you really think that the only option is for us to run corporate machines in our homes? I don’t know if I should feel sorry about your lack of imagination, or if you are trying to strawman us here. I’m going to assume lack of imagination, don’t assume malice and all that.

For example, that’s what simple legislation could do. Let’s say I buy a cellphone/computer, then buy an app/program for that device, and the device has the required specifications to run the software. The company that sold me that software should be obligated by law to make sure they give me a version of the software that runs on my machine forever. This is not a lot to ask for; this is literally how software worked before the internet.

But now, behind the cover of security and convenience, this is all out the window. Each new Windows/macOS/iOS/Android/Adobe/fucking-anything update asks for more and more hardware while delivering little to no meaningful new functionality. So we need to keep upgrading and upgrading, and spending and spending.

But this is not a given.

As a developer, my default definition of “slow” is whether it’s slow on my machine. Not ideal, but chimp brain do chimp brain things. My eyes see my own screen all day, not yours.
You can also build a chair out of shitty plywood that falls apart when someone who weighs a bit more sits on it, instead of quality cut wood. I mean, fine if you want to make a bad product but then you’re making a bad product.

Resource optimization has nothing to do with product quality. Really good experiences can be delivered with shitty resource consumption, and really bad experiences can be blisteringly fast and well optimized.

The reason programmers work in increasingly abstract languages is to do more with less effort at the cost of less efficient resource utilization.

Rollercoaster Tycoon was ASM. Slay the Spire was Java. They’re both excellent games.
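To put a toy example on that trade-off (mine, not anything from those games): in a JVM language like Kotlin, the same million integers cost several times more memory behind a generic collection than in a primitive array, and that’s exactly the convenience-for-efficiency swap abstraction buys you.

```kotlin
// Toy illustration only: the convenience of a generic collection vs.
// the memory footprint of a primitive array holding the same data.
fun main() {
    val n = 1_000_000

    // Abstract and convenient: List<Int> boxes every element into an Integer
    // object (roughly 16 bytes each, plus a reference per slot in the list).
    val boxed: List<Int> = List(n) { it }

    // Closer to the metal: IntArray stores raw 4-byte ints contiguously.
    val primitive = IntArray(n) { it }

    // Same data either way; the boxed version just trades memory (and cache
    // locality) for more general-purpose, convenient code elsewhere.
    println(boxed.last())      // 999999
    println(primitive.last())  // 999999
}
```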

Yeah, I don’t really have a problem with games except for the stuff added on purpose just to make the user experience worse like DRM. I was more thinking about trends like using Electron for desktop development.
I love the good old games on ASM.

It’s a lot cheaper to have double the ram

yeah a lot cheaper to force someone else to buy double the RAM. No thanks.

Companies don’t pay for your 2x RAM and it doesn’t slow down their user acquisition so they don’t care.
lol pay for someone. If it’s your code, you are that someone.

Companies own the code you write.

It’s not your code if you’re working for a corp - it’s theirs.

If someone else is paying you, you can write sloppy code. Got it.

Psychopath

Just because you don’t own something doesn’t mean you should trash it.

First you insist that companies don’t own the code then you say if you don’t own it you don’t have to care.

God I hope I never work with an idiot like you.

I don’t think I’m saying either of those things. I’m saying the opposite actually.

You seem to be suggesting that even though you are responsible for writing code, the company should hire someone else to optimize it for you.

Resources are just way cheaper than developers.

It’s a lot cheaper to have double the ram than it is to pay for someone to optimize your code.

I don’t see where you’re reading that idea.

It’s a lot cheaper to double the ram ergo you do not have to pay someone to optimize your code.

Where are you getting this bizarre inverse from?

My point is, developers should be writing optimized code in the first place.
I 100% agree. But, where Linux?
Can current Windows even work with 2GB of RAM?
Yep, the minimum is a 1 GHz CPU with 1 GB of RAM (32-bit) or 2 GB of RAM (64-bit), just don’t expect much out of it lol
Check Windows 10 System Requirements & Specs | Microsoft

This is like the definition of a “conservative”. Progress shouldn’t happen because they’re not ready for it. They are comfortable with what they use and are upset that other people are moving ahead with new things. New things shouldn’t be allowed.

Most games have the ability to downscale so that people like this can still play. We don’t stop all progress just because some people aren’t comfortable with it. You learn to adjust or catch up.

More “conservative” in terms of preserving the planet’s resources.

You don’t need gigabytes of RAM for almost any consumer application, as long as the programming team is interested in/incentivized to write quality software.

It's not really about comfort when you buy software and it doesn't work unless you also buy an $800 hardware upgrade. Especially when it worked fine on the previous version and the only difference is the addition of extraneous features.

I think the examples given are just poorly chosen. When it comes to regular applications and DRM, then yes, that’s ridiculous.

On the other hand, when it comes to gaming, then yes, give me all the raytracing and visible pores on NPCs. Most modern games also scale down well enough that it’s not a problem to have those features.

If they can downscale enough, they should be able to pass this test.
The topic is bloatware, not games. Very different. When it comes to gaming, the hardware costs are a given (for the sake of innovation, as you put it); but when it comes to something fundamental to your computer—think of the window manager or even the operating system itself—bloat is like poison in the hardware’s veins. It is not innovation. It is simply a waste of precious resources.

The topic is bloatware, not games.

The original post includes two gaming examples, so it’s actually about both, which is a bit unfortunate, because as you’ve said, they’re two very different things.

It’s the opposite. Limitations foster creativity. Those old computers and game consoles could do amazing things when people wanted to do something. Now you don’t have to think about what you’re doing; just expect the user to have high-end equipment and a super-high-speed Internet connection. It’s the equivalent of saying you need a trophy truck to go over the road you just built because it’s too shitty for a regular car to drive on.

Somebody didn’t live through the “Morrowind on Xbox” era, where “creativity” meant intentionally freezing the loading screen and rebooting your system in order to save a few KB of RAM so the cell would load.

But it also had no automatic corpse cleanup, so the game would eventually become unplayable: entities died outside of your playable area where you couldn’t remove them from the game, creating huge bloat in your save file.

Not all creativity is good creativity.

“Limitations foster creativity.”

100% agree. But there’s no reason to limit innovation because some people can’t take advantage of it. We also shouldn’t force people to constantly upgrade just to have access to something; however, there should be a limit to this. 20 years of tech change is huge. You could get 2 GB of RAM in most home computers back in the early-to-mid 2000s… that’s two decades ago.

I’m still gaming on my desktop that I built 10 years ago quite comfortably.

It’s conservationist, reducing hardware requirements to lengthen the lifetime of old hardware.

Less on general software and more on the gaming side: why target the iGPU then? Although it’s common, even something nearly a decade old would be an instant uplift in gaming performance. The ones that typically run into performance problems are mostly laptop users, the segment the industry is most wasteful with, since unless you own a laptop like a Framework, the user constantly replaces the entire device.

I, for one, am always behind lengthening the lifetime of old hardware (hell, I just replaced a decade-old laptop recently), but there’s a limit to the expectations to have. E.g., don’t expect to be catered to iGPU-wise if you willingly picked a pre-Tiger Lake iGPU. The user intentionally picked the worse graphics hardware, and catering the market to bad decisions is a bad move.

I, for one, hate the way PC gamer culture has normalized hardware obsolescence. Your hobby is just for fun, you don’t really need to gobble up so much power and rare Earth minerals and ever-thinner wafers all to just throw away parts every six months.

I have plenty of fun playing ascii roguelikes and I do not respect high performance gaming. It’s a conservationist nightmare.

Who’s throwing away stuff every six months? Hardware cycles aren’t even remotely that short; hell, Moore’s law was never that short in the entire existence of said law. And it’s not like I don’t do my fair share of preventing hardware waste (my literal job is refurbishing and reselling computer hardware; I’m legitimately doing more than the average person, trying to keep older hardware going several times over). But it’s not my job to dictate what is fun and what’s not. What’s fun for you isn’t exactly everyone else’s definition of fun.
Fuckhuge trucks that roll coal are fun for some people too, but fuck em.

Honestly, we are hitting the budgetary limits of what game graphics can do, for example.

A lot of new games look substantially worse than the Last of Us Part 2, which ran on ancient hardware.

One could point to the inclusive or environmental aspect to this approach.

I think that every operating system needs to have a “do what the fuck I told you to” mode, especially when it comes to networking. I’ve come close to going full luddite just trying to get smart home devices to connect to a non-internet-connected network (which of course you can only do through a dogshit app) and having my phone constantly try to drop that network since it has no Internet.

I get the desire to have everything be as hand-holdy as possible, but it’s really frustrating when the hand holding way doesn’t work and there is absolutely zero recourse, and even less ability to tell what went wrong.

Then there’s my day job, where I get to deal with crappy industrial software, flaky Internet connections, and really annoying things like Hyper-V occupying network ports when it’s not even open.

Just use Linux?
Tell that to his boss lol
Yeah, I’d love to, but first we have to tell that to Rockwell, Siemens, Bosch, ABB, etc, etc. All the proprietary software runs on Windows. Not to mention getting my company on board when we’re already heavily into the Microsoft ecosystem at the corporate level.
It kind of baffles me that people are still invested in Microsoft at a corporate level considering the costs associated with it.