I think a LOT of people are missing the fact that we got LUCKY with this malicious backdoor.

The backdoor was created by an insider threat: a developer / maintainer of various Linux packages. The backdoor was apparently pushed upstream on March 8th (I believe) and MADE IT PAST all QA checks.

Let me state that again. Every quality assurance process, security check, etc., failed to catch this.

This was so far upstream, it had already gotten into the major Linux distributions. It made it into Debian pre-release, Fedora rolling, OpenSUSE rolling, Kali rolling, etc.

This is an example of the Supply Chain Security problem that CISOs love to talk and freak out about. This is an example of the Insider Threat that is the bogeyman of corporate infosec.

A couple more weeks, and it would have been in the stable releases of many major distributions without any of us knowing about it.

The ONLY reason we know about it is that @AndresFreundTec got curious about login latency and some benchmarking results that had nothing to do with security, ran the issue down, and stumbled upon a nasty mess that was trying to stay hidden.

It was luck.

That's it. We got lucky this time.

So this raises the question: did the malicious insider backdoor anything else? Are they working with anyone else who might have access to other upstream packages? If the QA checks failed to find this specific backdoor by this specific malicious actor, what other intentional backdoors have they missed?

And before anyone goes and blames Linux (as a platform or as a concept): if this had happened (if it HAS happened!!!) in Windows, macOS, iOS, etc.... we would not (or will not) know about it. It was only because this entire stack is open source that Andres was able to go back and look through the code himself.

Massive props and kudos and all the thank-yous to Andres, to those who helped him, to all the Linux teams jumping on this to fix it, and to all the folks on high alert just before this Easter weekend.

I imagine (hope) that once this gets cleaned up, there will be many fruitful discussions around why this passed all checks and what can be changed to prevent it from happening again.

(I also hope they run down any and all packages this person had the signing key for....)
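For anyone cleaning up: the backdoor shipped in the upstream xz releases 5.6.0 and 5.6.1 (CVE-2024-3094). A minimal sketch of a local check, assuming only standard `xz` and `grep` behavior (this is an illustration, not an official detection tool):

```shell
# check_xz_version VERSION: classify an xz version string against the two
# upstream releases known to carry the CVE-2024-3094 backdoor.
check_xz_version() {
  case "$1" in
    5.6.0|5.6.1) echo "affected" ;;                      # backdoored releases
    *)           echo "not a known-backdoored release" ;;
  esac
}

# Check the locally installed xz, if one is present:
installed="$(xz --version 2>/dev/null | grep -oE '[0-9]+\.[0-9]+\.[0-9]+' | head -n1)"
if [ -n "$installed" ]; then
  echo "xz $installed: $(check_xz_version "$installed")"
fi
```

Note that this only matches upstream version numbers; distributions shipped patched or rolled-back builds, so always defer to your distro's own advisory.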

#infosec #hacking #cve #cve20243094 #linux #FOSS

@tinker @AndresFreundTec repeat after me: open source packages are not my supply chain unless i have a contract with their maintainer
@whitequark @tinker @AndresFreundTec Well, you could resolve not to use OSS without such a contract, but you'd be pretty unusual -- and I'm not sure how you'd get a contract with the maintainers of, say, every JS dependency of something like React (which commonly does run server-side to speed up initial page load these days, and could do ... all sorts of malicious things in that context).
@rst @tinker @AndresFreundTec unless you do have one, you do not have a supply chain.

@whitequark @tinker @AndresFreundTec The usual definition of "supply chain" is all the places where you get your code -- whether a contractual relationship exists (as in, say, the SolarWinds attack) or not (as in the current xz attack, or the case described at the link below). The consequences for the victims are the same either way, so focusing on having legal paperwork is a distraction, not a defense.

https://www.reversinglabs.com/blog/more-malicious-npm-packages-found-in-wake-of-jumpcloud-supply-chain-hack


@rst @tinker @AndresFreundTec that's not the usual definition of a "supply chain"; that's the definition of a "supply chain" that CISOs came up with as an attempt to pawn off their responsibility onto unpaid labor

literally anywhere else your "supply chain" consists of organizations you're paying in exchange for services and labor

@whitequark
That... Does not accurately describe why they use the term at all. It might be useful for you to understand why they do, but you don't seem super-interested.
@rst @tinker @AndresFreundTec

@dymaxion @whitequark @rst @tinker @AndresFreundTec understandably, they are pushing back against creating a liability relation between OSS maintainers and users, in the spirit of all those no-guarantee clauses

Those who form the supply chain of OSS are integrators and commercial entities reselling the software, c.f. the CRA in the EU

@raito
Why would CISOs want to create a liability relation with entities that by definition have no damn money?

That term is first aimed at internal risk management infrastructure, which understands supply chain risk more generally, to make and communicate the problem and make resources appear. Secondarily, it's aimed at commercial software vendors, who do have money and need to get their shit together. Third, it's a term the security community as a whole uses to think about the problem.

Independent FOSS devs are part of the software supply chain in exactly the way rocks are part of the mineral supply chain, for better and worse.
@whitequark @rst @tinker @AndresFreundTec

@dymaxion @raito @rst @tinker @AndresFreundTec
well, we still have time and labor.

and yep! that last paragraph is exactly the bit I have a problem with. from the perspective of an independent FOSS dev, the whole concept exists to make my life worse with no upside for it. since it doesn't seem like CISOs care about not making my life worse, I see no reason to spend effort acknowledging their motivation; only pushing back against it.

@whitequark
Basically, you get the shitty choice between engaging with the process and working for solutions to the societal problem of FOSS insecurity that take as little effort as possible on your part and actually work, or the world is likely to drop a liability regime in your lap that won't fix the problem but will cost you a lot of time and money.

Because software insecurity is now a society-level concern, and because FOSS insecurity is a nontrivial part of that, opting out isn't an option. Saying that the relationship doesn't exist etc. is just giving up any seat at that table.

Any CISO who isn't an idiot does care about your life getting harder, because they want your work output to exist; while their first priority is the security of their org, their zeroth priority is the success of their org, which requires FOSS code. Any CISO who isn't a sociopath puts good outcomes for society before either, which requires a similar balance between insecurity and your life, but even the sociopaths should care. Unfortunately, as with executive sociopaths, there are a lot of idiots out there. Giving up your seat still doesn't help.
@raito @rst @tinker @AndresFreundTec

@dymaxion You're trying to sneak in this premise of a "problem of FOSS insecurity" like it's a given, but I'm pretty sure lots of people (including me) would fundamentally disagree that such a problem exists at all.