I think a LOT of people are missing the fact that we got LUCKY with this malicious backdoor.

The backdoor was created by an Insider Threat - by a developer / maintainer of various Linux packages. The backdoor was apparently pushed on March 8th (I believe) and MADE IT PAST all QA checks.

Let me state that again. Any quality assurance, security checks, etc., failed to catch this.

This was so far upstream, it had already gotten into the major Linux distributions. It made it into Debian pre-release, Fedora rolling, OpenSUSE rolling, Kali rolling, etc.

This is an example of Supply Chain Security that CISOs love to talk and freak out about. This is an example of an Insider Threat that is the boogey man of corporate infosec.

A couple more weeks, and it would have been in many major distributions without any of us knowing about it.

The ONLY reason we know about it is because @AndresFreundTec got curious about login issues and some benchmarking checks that had nothing to do with security and ran the issue down and stumbled upon a nasty mess that was trying to remain hidden.

It was luck.

That's it. We got lucky this time.

So this raises the question. Did the malicious insider backdoor anything else? Are they working with anyone else who might have access to other upstream packages? If the QA checks failed to find this specific backdoor by this specific malicious actor, what other intentional backdoors have they missed?

And before anyone goes and blames Linux (as a platform or as a concept), if this had happened (if it HAS happened!!!) in Windows, Apple, iOS, etc.... we would not (or will not) know about it. It was only because all these systems are open source that Andres was able to go back and look through the code himself.

Massive props and kudos and all the thank yous to Andres, those who helped him, to all the Linux teams jumping on this to fix it, and to all the folks on high alert just before this Easter weekend.

I imagine (hope) that once this gets cleaned up, there will be many fruitful discussions around why this passed all checks and what can be changed to prevent it from happening again.

(I also hope they run down any and all packages this person had the signing key for....)

#infosec #hacking #cve #cve20243094 #linux #FOSS

@tinker @AndresFreundTec thank you for taking the time to comment on this, always grateful for your insights, especially on things like this.

@tinker @AndresFreundTec I hope there are enough people looking for what other packages are/were in this sort of state (single overworked dev) and whether they have gotten similar sorts of suspicious help.

I.e., did we catch the first one, or did we catch the Nth one to repeat an already-used playbook?

@JiSe @tinker @AndresFreundTec I'm willing to bet it's the Nth time something like this went through. "Our" software stack has become just too complex and overbuilt.

@datenwolf @JiSe @tinker @AndresFreundTec yes!

it gives me comfort when a new language grows in popularity by merit, and old solutions are rewritten from the ground up

as with #rust being used to modernize parts of the Linux kernel

edit: because there are fresh eyes on it, and the ACT of translation will highlight flaws, hacks, and background that hopefully are not recreated

@beepcheck Could you please add the RustLang tag? 🥰
Background: https://fosstodon.org/@mo8it/112056453394255413
Mo :ferris: :tux: (@[email protected])

I am starting a Mastodon campaign :omya_mastodon: Every time I see a post with only the #Rust tag, I will kindly ask the poster to use #RustLang instead 😇 My feed is full with unrelated content about the film, the game and of course photography of rusty metal 🤬 I will also only post using the #RustLang tag starting from now 😤 You can join me! 🤗 The goal is to let the usage of the #Rust tag vanish so people can only follow #RustLang 😃

@JiSe @tinker @AndresFreundTec
If you catch a friend in a lie, it's a safe bet that's not the first time they've lied to you.

@JiSe @tinker @AndresFreundTec

After Heartbleed in 2014 some things were improved, like the Core Infrastructure Initiative by the Linux Foundation, but not enough, and not in enough places.

So we know that lots of projects suffer. I remember GNU Privacy Guard - which the whole internet depends on - was run by a single developer, not quite making a living comparable to other developers.

@tinker @AndresFreundTec "got curious about login issues and some benchmarking checks that had nothing to do with security" - Paging Clifford Stoll :-)

@thisMattWilson it was just a 75-cent error. Just an accounting rounding error.

Turns out to be a German hacker working for the KGB 🙃
Underrated book, always recommending it to fellow IT folks.

@megamatt A great book. I'd also recommend "The Soul of a New Machine" from the same era.
@megamatt I was moderately young when I read it (~18?) and it left me with lots of questions that gradually got answered over time. The studied lack of interest / demarcation disputes from state bureaucracies, for example.
@thisMattWilson same. I had no concept of command lines, scripts, or anything programming. But it always stayed with me somehow.

@tinker @AndresFreundTec

This reminds me of a situation in an early Ubuntu. A pentest showed that a certain innocuous extension was self-executing in the webserver as if it were PHP. To overcome it, we had to overwrite the system MIME type entry that for some inexplicable reason had become a PHP-executed type instead of a content type. I tracked down the change upstream and couldn't find an explanation for it. I have always wondered if it was a deliberate plant to hack some webserver.
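For context, the kind of misconfiguration described here can be a one-line handler mapping. A hypothetical Apache sketch (the `.gif` extension is made up for illustration; the actual Ubuntu change may have looked different):

```apache
# Expected: only .php files are handed to the PHP interpreter
AddType application/x-httpd-php .php

# The kind of change described above: an "innocuous" extension
# (hypothetical example) also mapped to the PHP handler, so files
# with that extension execute server-side instead of being served
# as plain content
AddType application/x-httpd-php .gif
```

A change like this is easy to miss in review precisely because it looks like routine MIME-type housekeeping.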

@gvenema @tinker that reminds me of summer of 2000 I think when we downloaded a trojaned copy of openssh from a compromised mirror which replaced the default PAM config with pam_accept.so that allows anything. The password for the day was "d", lol. Found out when someone mistyped their password and logged in anyway, "hmm that's weird" investigation quickly found the issue and it wasn't hard to fix, but people have been trying a long time

@tinker @AndresFreundTec

Of course most Western network appliances were riddled with backdoors and cases of fake cryptography on demand before the opsec community got better at reverse engineering them.

And then there is the case of the Java "export cryptography", which was sort of out in the open, but also it was a much forgotten default setting on a lot of systems in the world.

@tinker it also got into brew and MacPorts - you know, the package managers that are almost required to run like 90% of Mac dev tools. You just had to have the latest Python installed via brew to have it delivered to your device. Lucky it didn't target Macs (for all we know), but who says no package there does currently?

@tinker @AndresFreundTec
In Windows, backdoors and security holes are mandated by corporate...

We only find out about them because hackers find them.

@tinker it’s extremely hard to find malicious code through regular QA and security testing. Although no one wants to hear that.
@Apiary @tinker I mean, especially when someone tries so hard to hide it. I had a small look at it and it's heavily obfuscated. Also, this dude planned this over *years* - that's a lot of criminal energy. Regular QA is there to find regular exploits, I'd say. The money is in the stuff that nobody has found yet. Just like how antivirus is basically useless against anything that has not yet been uncovered.

"This is an example of Supply Chain Security that CISOs love to talk and freak out about. This is an example of an Insider Threat that is the boogey man of corporate infosec."

Yeah, but are they willing to exchange money to make it go away? 🦗🦗🦗

@tinker @AndresFreundTec

@some_natalie Imagine coming into a money contract for preventing this kind of issue, and then it happens again (because of the sheer size and complexity of OSS codebases). Even if someone was able to come up with a clever liability limitation clause for this case, it would still take time and personal risk to test whether it really provides enough protection. Meanwhile, M$ will just continue sales with the "as is" clause.

For sure - it's just that the alternative to "as is" is absolutely staggeringly expensive.

The average salary of a developer in the US is approximately $120,000 according to the latest StackOverflow survey. Add in benefits and such, let's call it $150k. At 50 weeks a year and 40 hours a week, that's 2000 hours or $75/hour.

Now let's look at how much code a full time engineer could be expected to audit and understand and be responsible for bringing those changes in - not time spent writing code or adding anything, just reading and verifying nothing fishy is going on. 1k lines per FTE hour? 10k? 100k? Does it matter? Can you devote the investment to do this?
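The back-of-the-envelope math above can be sketched out; the $150k fully loaded figure and the 10-million-line / 1k-lines-per-hour scenario below are the post's assumptions and an illustrative guess, respectively:

```python
# Figures from the post: ~$120k average salary, call it $150k fully
# loaded; 50 weeks * 40 hours = 2000 working hours per year.
fully_loaded_cost = 150_000          # USD per FTE per year (assumed)
hours_per_year = 50 * 40             # 2000 hours
hourly_rate = fully_loaded_cost / hours_per_year  # $75/hour

def audit_cost(total_lines: int, lines_per_hour: int) -> float:
    """Cost in USD to read and verify total_lines at a given review speed."""
    return total_lines / lines_per_hour * hourly_rate

# Hypothetical: auditing 10 million lines of dependencies at 1k lines/hour
print(f"${audit_cost(10_000_000, 1_000):,.0f}")  # prints "$750,000"
```

Even at an optimistic 10k lines per hour, that hypothetical dependency tree still costs $75,000 per full audit pass - before any of it changes and needs re-review.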

Scaling trust is a phenomenally fun and fascinating problem.

@2080 @tinker

@tinker @AndresFreundTec Looking to the future, how do we maintain the security and integrity of these upstream packages to prevent this from happening again? Is it just added vigilance for maintainers, developers and users for code changes, or is there something else that can be done?
@gandalf_the_blue @tinker
Fresh installs and credentials for everything and everyone! Hooray!
@gandalf_the_blue @tinker @AndresFreundTec I think we need to start thinking about the social side of OSS development. It's hardly foolproof, but I am far more inclined to trust people I have met in physical space, for instance. There are many caveats and problems here, but I don't think we can, as a rule, just trust people who appear online. Vigilance, yes, but also verification, and working on social chains of trust that are difficult to fake.
@tinker @AndresFreundTec I'm thinking there is a niche here waiting to be filled by an organisation that can receive funding from tech companies and governments (since both of them have vested interests here), hires infosec developers and takes on the burden of verifying the security of these upstream packages, keeping everyone safe.

@tinker @AndresFreundTec

This dates back to 2016, and it smelled already back then. Quite a few odd bits I don't like. /o\

Get rid of xz, I would say. Scrap it.

https://www.nongnu.org/lzip/xz_inadequate.html

Xz format inadequate for general use

@GNUmatic @tinker @AndresFreundTec

who is using xz for long-term archiving?

@troglodyt @tinker @AndresFreundTec

Maybe anyone who's just tempted to do so by benchmarks and that little bit of extra space savings? 🤔

https://www.rootusers.com/gzip-vs-bzip2-vs-xz-performance-comparison/

Gzip vs Bzip2 vs XZ Performance Comparison

Gzip, Bzip2 and XZ are popular compression tools, but which performs best? Here we benchmark them and compare the trade off between compression ratio and speed.

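The trade-off the linked benchmark describes can be sketched with Python's standard-library bindings for all three formats (synthetic, highly repetitive data here, so the ratios are far better than you would see on real files):

```python
import bz2
import gzip
import lzma

# Highly repetitive sample data; real-world files compress much less well.
data = b"The quick brown fox jumps over the lazy dog. " * 2000

# Compressed size at maximum effort for each format.
sizes = {
    "gzip":  len(gzip.compress(data, compresslevel=9)),
    "bzip2": len(bz2.compress(data, compresslevel=9)),
    "xz":    len(lzma.compress(data, preset=9)),
}

for name, size in sizes.items():
    print(f"{name:5s}: {size:6d} bytes ({size / len(data):.2%} of {len(data)})")
```

On real data, xz typically wins on ratio at the cost of much slower compression and higher memory use, which is exactly the temptation the benchmark article documents.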
@tinker @AndresFreundTec I didn’t need to sleep tonight 😅

@tinker In this case tests passed, code built with xz still worked, and a fuzzer had received a patch from the same actor to hide their change. What other QA is there? We typically trust upstream.

Patches from this actor to other projects appear to be being reviewed, and one - to libarchive - has already been seriously reworked.

And yes, supply chain is hard, but this is a great example of the benefits of open source. Many, many people are working on this and reviewing code.

@tinker @AndresFreundTec So uhm... ackshually it *was caught* by the cybercrap and all. Valgrind *did catch it*.

But the UX of it, and the programming languages, and all the rest of the tooling, was so shit that the rational answer was not to fix it or go find what was fishy, but to *deactivate the check*.

The things to change? Actually building ergonomic tools adapted to the task, instead of yelling at everyone about supply chain.

@tinker @AndresFreundTec
This seems more like a nation-state / criminal org attack (the What) — NOT an insider threat (the How).
@tinker @AndresFreundTec repeat after me: open source packages are not my supply chain unless i have a contract with their maintainer
@whitequark @tinker @AndresFreundTec I have a funny feeling there's a lot of security people at a lot of companies doing their best to NOT send an "I TOLD YOU SO" email to their CTO.
@whitequark @tinker @AndresFreundTec Well, you could resolve not to use OSS without such a contract, but you'd be pretty unusual -- and I'm not sure how you'd get a contract with the maintainers of, say, every JS dependency of something like React (which commonly does run server-side to speed up initial page load these days, and could do ... all sorts of malicious things in that context).
@rst @whitequark @tinker @AndresFreundTec react / npm / nodejs/ electron is a whole nother nightmare lol
@crissi @rst @whitequark @tinker @AndresFreundTec brb signing a support contract with the left-pad maintainer
@rst @tinker @AndresFreundTec unless you do have one, you do not have a supply chain.

@whitequark @tinker @AndresFreundTec The usual definition of "supply chain" is all the places where you get your code - whether a contractual relationship exists, as in, say, the SolarWinds attack, or not, as in the current xz attack or the case described below. And the consequences for the victims are the same either way, so focusing on having legal paperwork is a distraction, not a defense.

https://www.reversinglabs.com/blog/more-malicious-npm-packages-found-in-wake-of-jumpcloud-supply-chain-hack

More malicious npm packages found in wake of JumpCloud supply chain hack

ReversingLabs researchers uncovered evidence of more malicious npm packages beyond those already disclosed — and conclude that the attack is still active.


@rst @tinker @AndresFreundTec that's not the usual definition of a "supply chain"; that's the definition of a "supply chain" that CISOs came up with as an attempt to pawn off their responsibility onto unpaid labor

literally anywhere else your "supply chain" consists of organizations you're paying in exchange for services and labor

@rst @tinker @AndresFreundTec you're making a mistake thinking I'm not aware of the rhetorical trick you're employing. I am. I simply do not acknowledge it as legitimate
@whitequark @rst @tinker @AndresFreundTec The supply chain doesn’t exist. It’s just people moving things.
@whitequark
That... Does not accurately describe why they use the term at all. It might be useful for you to understand why they do, but you don't seem super-interested.
@rst @tinker @AndresFreundTec

@dymaxion @whitequark @rst @tinker @AndresFreundTec understandably, they are pushing back against creating a liability relation with OSS maintainers and users, in the spirit of all those no-guarantee clauses

Those who are the supply chain of OSS are integrators and commercial entities reselling the software, c.f. CRA in EU

@raito @dymaxion @rst @tinker @AndresFreundTec correct. I understand why they do; it just doesn't match why they *say* they do.

@raito
Why would CISOs want to create a liability relation with entities that by definition have no damn money?

That term is first aimed at internal risk management infrastructure, which understands supply chain risk more generally, to make and communicate the problem and make resources appear. Secondarily, it's aimed at commercial software vendors, who do have money and need to get their shit together. Third, it's a term the security community as a whole uses to think about the problem.

Independent FOSS devs are part of the software supply chain in exactly the way rocks are part of the mineral supply chain, for better and worse.
@whitequark @rst @tinker @AndresFreundTec

@dymaxion @raito @rst @tinker @AndresFreundTec
well, we still have time and labor.

and yep! that last paragraph is exactly the bit I have a problem with. from the perspective of an independent FOSS dev, the whole concept exists to make my life worse with no upside for it. since it doesn't seem like CISOs care about not making my life worse, I see no reason to spend effort acknowledging their motivation; only pushing back against it.

@whitequark
Basically, you get the shitty choice between engaging with the process and working for solutions to the societal problem of FOSS insecurity that take as little effort as possible on your part and actually work, or the world is likely to drop a liability regime in your lap that won't fix the problem but will cost you a lot of time and money.

Because software insecurity is now a society-level concern, and because FOSS insecurity is a nontrivial part of that, opting out isn't an option. Saying that the relationship doesn't exist etc. is just giving up any seat at that table.

Any CISO who isn't an idiot does care about your life getting harder, because they want your work output to exist, and while their first priority is the security of their org, their zeroth priority is the success of their org, which requires FOSS code. Any CISO who isn't a sociopath puts good outcomes for society before either, which requires a similar balance between insecurity and your life - but even the sociopaths should care. Unfortunately, as with executive sociopaths, there are a lot of idiots out there. Giving up your seat still doesn't help.
@raito @rst @tinker @AndresFreundTec

@dymaxion You're trying to sneak in this premise of a "problem of FOSS insecurity" like it's a given, but I'm pretty sure lots of people (including me) would fundamentally disagree that such a problem exists at all.

@dymaxion

I think the premise you're introducing here is backwards.

It's not a FOSS developer's concern if some or all corporations take their software, exploit and extract value from it, use it without understanding its limitations, and then incur risk on top of it.

That's the corporations' fault and responsibility.

FOSS developers create software for our own use and the use of others. The inherent social contract is constrained WITHIN THE BOUNDS of understanding the mutual aid and limitations of liability.

So... yay for corporations doing what corporations do, but it's not FOSS's problem.

The corporations are fully welcome to develop their own software solutions extracting and exploiting their own human "resources".

@whitequark @raito @rst @AndresFreundTec

@whitequark
Re: upside, you are also part of society, and are therefore also harmed by software insecurity. Ignoring externalities related to your work is equally bad form for you, even though you don't have the direct capital force-multiplier effect behind your choices.

And yes, you do have time and labor, but it's become a colonial-ish relationship.
@raito @rst @tinker @AndresFreundTec

@dymaxion @whitequark @raito @rst @tinker @AndresFreundTec this was rad to read, thanks all involved. Not that it matters but on balance I think I'm still team @whitequark on this one - FOSS developers don't owe capitalism anything (but I also can see that leads to a crappy outcome - just slightly less crappy than alternatives.)

@dymaxion @raito @whitequark @rst @tinker @AndresFreundTec and our answer is that this risk management chain and security world are not equipped to deal with us, as is shown by having to use an analogy that is problematic to the point of harming the point being made.

We will answer the calls when they are useful. We do not have the time until then.

@Di4na
Cool.

Enjoy your personal liability for indirect damage.
@raito @whitequark @rst @tinker @AndresFreundTec