GOG is seeking a Senior Software Engineer with C++ experience to modernize the GOG GALAXY desktop client and spearhead its Linux development

https://lemmy.ml/post/42259354

Penguinz0 (cr1tikal): Yeah Baby! That’s what I’ve been waiting for!

Wake me up when it becomes a FOSS launcher
GitHub - Heroic-Games-Launcher/HeroicGamesLauncher: A games launcher for GOG, Amazon and Epic Games for Linux, Windows and macOS.

Isn’t that a different project?
Also: Electron
Okay, in other words: I won’t be buying any more Steam games :)
You don’t need GOG Galaxy to install and run GOG games. In fact, you shouldn’t if you care about keeping your games.
Currently happily using Heroic to manage GOG games. But, I still welcome GOG putting in effort to make it a smooth experience.

Yes, and the DRM-free part only matters if you keep a copy of the installer. Galaxy doesn’t do that.

the DRM-free part only matters if you keep a copy of the installer. Galaxy doesn’t do that.

Why would that be relevant on Linux? Wine/Proton prefixes are portable.

File compression, for starters.

File compression, for starters.

You can compress folders and entire file systems.

A dedicated installer is much easier to bring around.

For one game, maybe. For a bunch of games, an automated backup that collects the entire library and save games is much more practical. There are several easy-to-use solutions, not to mention scripting if you want really fine-grained control.

tar -cJf DIY-dedicated-installer.tar.xz /path/to/wine/bottle

Now you have a very portable, highly compressed file that is easy to move around.
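
And if you want that for a whole library, here’s a rough sketch (assuming the prefixes all live under one directory; the paths and names here are just placeholders):

for bottle in "$HOME/Games"/*/; do
    name=$(basename "$bottle")
    # one compressed archive per game prefix
    tar -cJf "backup-$name.tar.xz" -C "$HOME/Games" "$name"
done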

Yes. Not using a launcher equals fewer barriers. GOG installers work out of the box with Wine. The whole point of GOG is literally that you can do all of that without restrictions like, say, being forced to use a launcher. So it’s not a big deal if Galaxy for Linux isn’t around.
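
For example (the installer filename here is just a placeholder for whatever offline installer you downloaded, and the prefix path is your choice):

WINEPREFIX="$HOME/Games/mygame" wine ./setup_mygame_1.0.exe
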
Wait, why? I use GOG Galaxy for GOG games, and Steam for the ones there. Should I be downloading offline installers for the GOG ones and saving them aside too?
If you want the benefit of a DRM-free game, yes. Otherwise you still don’t own the game. GOG has removed games from libraries before and will again at some point in the future.
Which games have they removed from libraries? Typically, these storefronts (including GOG) will remove games from sale, but not from the libraries of customers who already bought them. For instance, they deeply discounted WarCraft 1 and 2 before Microsoft requested their delisting, but I’ve still got them in my library.
That’s some kind of fallacy, I am sure. Just because I want to own my games, I must not care about the hassle of installing them? False equivalence, maybe?
Have you tried Heroic Games Launcher and found it to be a hassle?
It was indeed quite a hassle, but worth it. That’s completely beside the point, though.
If you care this much about not using Steam, why would this be the deciding factor? I can play GOG games right now on Linux.

I tried that some time ago, and at least at that point it needed configuration to get up and running. It was a hassle. I have family that needs a lot of my time at the moment. Between August and December I could find less than 10 days where I was able to decide by myself what I do after workdays or on weekends.

I’m not going to spend those precious minutes configuring any damn thing. Steam works out of the box. Now someone was just mentioning something called the Heroic launcher. Sounds good. Wonder why GOG is not linking to it very visibly on its site if it works?

Yeah, I understand Linux gaming was not great until fairly recently. Heroic and Lutris have been around for a few years now as I understand it, though I admit I’m still new to the scene. Honestly though, in my experience, Steam still doesn’t just work in some cases, and I’ve admitted to myself that’s just going to be the Linux gaming experience. I check ProtonDB before buying any game now.

It strikes me as odd that Heroic doesn’t want to be available with apt, though! It’s even advertising that it is intentionally packaged in a way that duplicates pre-existing libraries – apparently just to take up some extra space on my hard drive for fun?!

Doesn’t really inspire much trust in them caring about how to use a computer’s resources. Whether one wants to be afraid of two applications sharing a library file or not should be left for the user to decide… And it’s not very nice that there are an increasing number of ways applications can be installed, and these clever people are supporting that development… How am I supposed to have any overview of what’s installed on my computer? This is starting to feel like Windows :(

I don’t really believe it’s very good for computer security that applications are installed without anything in the OS keeping track of whether they need security updates or not!


So far this is only about one person and none of the ecosystem contributions to Mesa, SDL, Wine,…

Definitely better than nothing, though!

I wonder what they’ve been doing in the meantime when a Linux native client was the most requested feature for so long.
GOG was recently bought from CDPR and is now owned by one of the co-founders, if I remember right. The focus shift towards giving any fucks about Linux as a platform likely has something to do with that.
CDPR is the game dev studio. Their parent company, CD Projekt, was the one that owned GOG. CDPR had nothing to do with it.
Right, thanks. I always get them mixed up

there’s a lot to be excited for, but

Job requirements
[…]

  • Active use of AI tools in daily development workflows, and enthusiasm for helping the team increase adoption

ew.

This is a “big part” of my job. In five months, what I’ve accomplished is adding AI usage tracking to Jira, along with a way to indicate how many story points it wound up saving or costing. Let’s see how this plays out.

If AI collapses as many expect it to, this job will still be there without that requirement.

I hope the bubble pops soon, and only smaller and more sustainable models stay
Agreed, AI has uses, but C-suite execs have no idea what they are and are paying millions to get their staff using AI tools in hopes of finding those uses. In reality they are making things worse with no tangible benefit, because they are all scared that someone will find this imaginary golden goose first.

Yeah, self-hosted open-source models seem okay, as long as their training data is all from the public domain.

Hopefully RAM becomes cheap as fuck after the bubble pops and all these data centers have to liquidate their inventory. That would be a nice consolation prize, if everything else is already fucked anyway.

Unfortunately, server RAM and GPUs aren’t compatible with desktops. Also, Nvidia have committed to releasing a new GPU every year, making the existing ones worth much less. So unless you’re planning to build your own data centre with slightly out-of-date gear - which would be folly, since the existing ones will be desperate to recoup any investment and selling cheap - it’s all just destined to become a mountain of e-waste.

Maybe that surplus will lay the groundwork for a solarpunk blockchain future?

I don’t know if I understand what blockchain is, honestly. But what if a bunch of indie co-ops created a mesh network of smaller, more sustainable server operations?

It might not seem feasible now, but if the AI bubble pops, Nvidia crashes spectacularly, data centers all need to liquidate their stock, and server compute becomes basically viewed as junk, then it might become possible…

I’m just trying to find a silver lining, okay?

I wonder if the server GPUs can be used for tasks other than computing LLMs.

Google Stadia wasn’t exactly a resounding success…

From a previous job in hydraulics, the computational fluid dynamics / finite element analysis that we used to do would eat all your compute resources and ask for more. Split your design into tiny cubes, simulate all the flow / mass balance / temperature exchange / material stress calculations for each one, and gain an understanding of how the part would perform in the real world. Very easily parallelizable, a great fit for GPU calculation. However, it’s a ‘hundreds of millions of dollars’ industry, and the AI bubble is currently ‘tens of trillions’ deep.

Yes, they can be used for other tasks. But we’ve just no use for the amount that’s been purchased - there’s tens of thousands of times as much as makes any sense.

So there would be an enormous surplus and a lot of e-waste. That’s a shame, but that’s going to happen anyway. I’m only saying that the silver lining is that it means GPU and RAM would become dirt cheap (unless companies manufacture scarcity like the snakes they are).

Industrial applications aren’t the only uses for it. Academic researchers could use it to run simulations and meta-analyses. Whatever they can do now, they could do more powerfully with cheap RAM.

Gamers who self-host could render worlds more powerfully. Indie devs could add more complex dynamics to their games. Computer hobbyists would have more compute to tinker with. Fediverse instances would be able to handle more data. Maybe someone could even make a fediverse MMO. I wonder if that would catch on.

Basically, whatever people can do now, more people would be able to do more powerfully and for cheaper. Computations only academia and industry can do now would become within reach of hobbyists. Hobbyists would be able to expand their capacities. People who only have computers to tinker with now would be able to afford servers to tinker with.

“Trickle-down” is a bullshit concept, as everything gets siphoned to the top and hoarded. But when that cyst bursts, and those metaphorical towers come crashing down, there’s gonna be a lot of rubble to sift through. It’s going to enable the redistribution of RAM on a grand scale.

I’m not pretending it’ll solve everyone’s problems, and of course it would have been better if they had left the minerals in the ground and data centers had never grown to such cancerous proportions. But when the AI bubble bursts and tech companies have to liquidate, there’s no denying that the price of RAM would plummet. It’s not a magic bullet, just a silver lining.

I would imagine any program running simulations, rendering environments, analyzing metadata, and similar tasks would be able to use it.

It would be useful for academic researchers, gamers, hobbyists, fediverse instances. Basically whatever capabilities they have now, they would be able to increase their computing power for dirt cheap.

Someone could make a fediverse MMO. That could be cool, especially when indie devs start doing what zuck never could with VR.

Like AI, blockchain is a solution in search of a problem. Both have their uses but are generally part of overcomplicated, expensive solutions which are better done with more traditional techniques.

Maybe I didn’t mean blockchain, cause I’m still not really certain what it is. I mean like the fediverse itself, or a mesh network, where a bunch of hobbyists self-hosting their own servers can federate as a system of nodes for a more distributed model.

Instead of all the compute being hoarded in power-hungry data centers, regular folks, hobbyists, researchers, indie devs, etc., would be able to run more powerful simulations, meta-analyses, renderings, etc., and then pool their data and collaborate on projects, ultimately creating a more efficient and intelligently guided use of the compute instead of simply “CEO says generate more profit! 24/7 overdrive!!!”

At the very least, a surplus of cheap RAM would expand the computing capabilities of everyone who isn’t a greedy corporation with enough money to buy up all the expensive RAM.

I read, I think just last week but for sure in the last month, that someone has created an AI card that lowers power usage by 90%. (I know that’s really vague and leaves a lot of questions.) It seems likely that AI-specific hardware and graphics hardware will diverge, I hope.

I think it’s called an inferencing chip. I read about it a few months ago.

Basically, the way it was explained, the most energy-intensive part of AI is training the models. Once training is complete, it requires less energy to make inferences from the data.

So the idea with these inferencing chips is that the AI models are already trained; all they need to do now is make inferences. So the chips are designed more specifically to do that, and they’re supposed to be way more efficient.

I kept waiting to see it in devices on the consumer market, but then it seemed to disappear and I wasn’t able to even find any articles about it for months. It was like the whole thing vanished. Maybe Nvidia wanted to suppress it, cause they were worried it would reduce demand for their GPUs.

At one point I had seen a smaller-scale company listing laptops for sale with their own inferencing chips, but the webpage seems to have disappeared. Or at least the page where they were selling it.

Whether you (or I) like it or not, Pandora’s box has been opened. There is no future in software development without the use of LLMs.
I appreciate your opinion, but I don’t believe you.
👍

Enjoy.

Say hi to the PMs and QA for me.

While this might be true, there’s a big difference between using LLMs for auto-completions, second-opinion PR reviews, and maybe mocking up some tests, and using them to write actual production code. I don’t see LLMs going away as a completion engine, because they’re really good at that. But I suspect companies that are using them to write production code are realizing, or will soon realize, that they might have security issues, and that for a human to work on that codebase it would likely have to be thrown away entirely and redone; using slop only cost them time and money without any benefits. But we’ll see how that goes. Luckily I work at a company where the managers used to be programmers, so there’s not much push for us to use it to generate code.
It’s so weird, I read this in a bunch of job listings nowadays. How the fuck is it a requirement?!?! You should be fluent in C++, but also please outsource your brain and encourage the team to do so as well. People are weird, man.

Yeah, what does GOG know?

The real source of wisdom is social media users who approach a topic with bad-faith, outrage-farming framing. I mean, just look at the upvotes, and you can easily tell how right you are; it’s basically science.

I’m sorry the only way you know how to write code is with an LLM holding your hand, but I believe if you really devote yourself into it you could learn to be a real programmer. Good luck!

And we open the book of troll arguments to chapter 1: Ad hominem

Keep going, it really makes you look like the rational one.

Maybe try a red herring next, or a straw man; those are always popular.

Bruh, your only “rebuttal” was a straw man and an appeal to authority. Make a better argument before you go accusing people of being trolls.

Oh ok.

‘The job listing does not say anything about outsourcing your brain.’

But everyone knows that, because it is obvious on its face.

The subtext, as always, isn’t about commenting on the subject of the article or even making any kind of cognizant point that could actually be rebutted. Much like the top comment, it is just running ‘ai bad’ through an LLM so that it fits the post.

Would you honestly say that the comment that I responded to was made in good faith?

Why did you attack the commenter personally? Are you not able to defend the idea without stooping so low?
Clearly you didn’t read the conversation, because they were less insulting and dumb than the person they replied to. Why are you so interested in defending trolls?