Linux Kernel Rust Code Sees Its First CVE Vulnerability
Which is worse?
Surely if X > 0 then this is still a net improvement?
After writing this comment I noticed it became a bit ranty, sorry for that. Something about this article rubbed me the wrong way.
The relevant section seems to be this:
Browser engines and garbage-collected runtimes are classic examples of code that fights the borrow checker. You’re constantly juggling different memory regions: per-page arenas, shared caches, temporary buffers, objects with complex interdependencies. These patterns don’t map cleanly to Rust’s ownership model. You end up either paying performance costs (using indices instead of pointers, unnecessary clones) or diving into unsafe code where raw pointer ergonomics are poor and Miri becomes your constant companion.
The first half is obviously correct: this kind of data model doesn’t work well with the ownership model Rust uses for its borrow checker. I don’t like the conclusion, though. Rust makes you pay the performance costs necessary to make your code safe. You would need to pay similar costs in other languages if you intend to write safe code.
Sure, if you are fine with potential memory corruption bugs, you don’t need to pay these costs, but that’s not how I would want to code.
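To make the “indices instead of pointers” trade-off concrete, here is a minimal sketch of the pattern the article alludes to. All names (`Node`, `Arena`, etc.) are illustrative, not from any real codebase: nodes refer to each other by index into a `Vec` rather than by reference, which sidesteps the borrow checker at the cost of an extra bounds check per access.

```rust
#[derive(Debug)]
struct Node {
    value: u32,
    next: Option<usize>, // index into Arena::nodes instead of a pointer
}

struct Arena {
    nodes: Vec<Node>,
}

impl Arena {
    fn new() -> Self {
        Arena { nodes: Vec::new() }
    }

    // Returns the index of the newly allocated node.
    fn alloc(&mut self, value: u32, next: Option<usize>) -> usize {
        self.nodes.push(Node { value, next });
        self.nodes.len() - 1
    }

    // Walk the "linked list" by following indices; every access is
    // bounds-checked, which is part of the performance cost being discussed.
    fn sum_from(&self, start: usize) -> u32 {
        let mut total = 0;
        let mut cur = Some(start);
        while let Some(i) = cur {
            total += self.nodes[i].value;
            cur = self.nodes[i].next;
        }
        total
    }
}

fn main() {
    let mut arena = Arena::new();
    let tail = arena.alloc(2, None);
    let head = arena.alloc(1, Some(tail));
    assert_eq!(arena.sum_from(head), 3);
}
```

The upside of this shape is that stale indices can at worst read the wrong node or panic, never corrupt memory, which is exactly the safety-for-performance trade described above.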
The other thing bugging me is how Miri being your companion is framed as something bad. Why? Miri is one of the best things about Rust’s unsafe-code tooling. It’s like Valgrind, or sanitisers, but better.
Now, the raw pointer ergonomics could be better, I’ll give them that. But if you dive deep into what Rust does with raw pointers, or rather what they are planning to do, it is really, really cool. Provenance, and supporting CHERI natively, are simply not possible for languages that chose the ergonomics of a raw integer over what Rust does.
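As a small illustration of the provenance point: Rust’s strict-provenance pointer APIs (stabilized in recent Rust, so this sketch assumes an up-to-date toolchain) let you do address-level tricks like tagging the low bits of an aligned pointer via `map_addr`, without ever laundering the pointer through a bare integer. Keeping provenance attached to the pointer is what makes capability hardware like CHERI representable at all.

```rust
fn main() {
    let x: u64 = 42;
    let p: *const u64 = &x;

    // A u64 is 8-byte aligned, so the low bits of the address are free to
    // use as a tag. map_addr transforms the address while preserving the
    // original pointer's provenance.
    let tagged = p.map_addr(|a| a | 1);
    let untagged = tagged.map_addr(|a| a & !1);

    // SAFETY: untagged has the same address and provenance as p, which
    // points to the live local x.
    let val = unsafe { *untagged };
    assert_eq!(val, 42);
}
```

Had we instead cast to `usize` and back, the compiler (and hardware like CHERI) would have no way to know which allocation the resulting pointer is allowed to access; that is the “raw integer” ergonomic the comment is contrasting against.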
Rust by default will not allow you to make certain kinds of errors, which is great. But if you are doing something advanced, down at the hardware level, you might need to disable those defaults in order to write the code you need. This is what people mean by “unsafe”: lacking the normal memory safeguards.
With careful coding, “unsafe Rust”, or normal C for that matter, can be free of bugs and safe. But if programmers make a mistake, vulnerabilities can creep in more easily in the unsafe sections.
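A minimal sketch of what “lacking the normal memory safeguards” means in practice: the same out-of-range access that safe Rust turns into a checked `None` can be performed unchecked inside an `unsafe` block, at which point the bounds reasoning becomes the programmer’s obligation, just like in C.

```rust
fn main() {
    let data = [10u8, 20, 30];

    // Safe Rust: bounds are checked; an out-of-range index yields None
    // instead of reading garbage.
    assert_eq!(data.get(5), None);

    // Unsafe Rust: get_unchecked skips the bounds check entirely. The
    // SAFETY comment is the programmer's proof obligation.
    // SAFETY: 1 < data.len(), so the access is in bounds.
    let v = unsafe { *data.get_unchecked(1) };
    assert_eq!(v, 20);
}
```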
But if you are doing something advanced, down at the hardware level
This part is wrong. Otherwise, yes, correct.
The “unsafe” code in Rust is allowed to access memory locations in ways that skip the compiler’s checks guaranteeing that the memory location has valid data. The programmer is on their own to ensure that.
Which as you say is just the normal state of affairs for all C code.
This is needed not because of hardware access but just because sometimes the proof that the access is safe is beyond what the compiler is able to represent.
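A classic case where the safety proof is beyond what the borrow checker can represent is handing out two disjoint `&mut` views into the same buffer. The standard library’s `split_at_mut` does this internally; here is a simplified sketch (the function name is illustrative).

```rust
fn split_first_rest(v: &mut [u32]) -> (&mut u32, &mut [u32]) {
    assert!(!v.is_empty());
    let ptr = v.as_mut_ptr();
    let len = v.len();
    // SAFETY: the regions [0..1) and [1..len) do not overlap. The borrow
    // checker cannot see this through raw pointers, but we can prove it,
    // which is exactly the "proof beyond the compiler" situation.
    unsafe {
        (
            &mut *ptr,
            std::slice::from_raw_parts_mut(ptr.add(1), len - 1),
        )
    }
}

fn main() {
    let mut buf = [1, 2, 3, 4];
    let (first, rest) = split_first_rest(&mut buf);
    *first = 10;
    rest[0] = 20;
    assert_eq!(buf, [10, 20, 3, 4]);
}
```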
Thank you for the correction, I’ll edit my comment.
sometimes the proof that the access is safe is beyond what the compiler is able to represent
Could you say a few more words about this? In what situations do you have to write ‘unsafe-tagged’ code blocks? Could this be changed by improvements to the compiler? Or is it necessitated by the type of task being done by the code?
Because Rust lets you choose when something is unsafe, instead of all of the code being unsafe all the time.
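A hypothetical illustration of that choice: the unsafe operation is confined behind a safe function whose body establishes the invariant once, so callers never write `unsafe` themselves. The function name and logic here are made up for the example.

```rust
/// Returns the last element without a per-call bounds check.
/// Safe to call from anywhere: the emptiness check is done here, once,
/// so the unsafety never leaks into the callers.
fn last_fast(v: &[i32]) -> Option<i32> {
    if v.is_empty() {
        return None;
    }
    // SAFETY: v is non-empty, so v.len() - 1 is a valid index.
    Some(unsafe { *v.get_unchecked(v.len() - 1) })
}

fn main() {
    assert_eq!(last_fast(&[1, 2, 3]), Some(3));
    assert_eq!(last_fast(&[]), None);
}
```

This is the structural difference from C: the reviewable attack surface is the body of `last_fast`, not every call site in the program.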
Note the other 159 kernel CVEs issued today for fixes in the C portion of the codebase
Memory safe languages that are not garbage collected are not all that common. Ada and Rust are two examples.
With great care, C++ and Zig can be.
I’m sure there’s a good reason a lot of the big players and the community at large have picked up Rust, though: docs, error messages, Cargo, community, etc.
I would argue that Rust does bring a lot to the table. I certainly would never code in C for work but I’ll happily reach for Rust.
Boon? There are plenty of fanboys out there selling Rust like AI, or in other words, snake oil.
Rust obviously has built-in safety features that C doesn’t have, but a shitty coder is a shitty coder and bad QC is bad QC. Now we’re seeing the reality of the consequences.
Rust and/or other memory-safe® languages may well be the future, but hopefully more people are now seeing the cracks. Just look at Cloudflare for a prime example.
The software had a limit on the size of the feature file that was below its doubled size. That caused the software to fail.
This is not a Rust problem… nor was the original problem of code writing entries to a file multiple times, nor the thing that made it worse: propagation of the poisoned file.
It certainly does. It’s way simpler to uphold all the invariants in, and review, the few percent of lines inside unsafe blocks than to make sure that 30 million lines of code are free of undefined behavior.
If some part of the code is battle tested, of course it’s completely unreasonable to rewrite it in any language. But if you have to do a major rewrite anyway, or write a new component, you should definitely use a memory-safe language. And no, it doesn’t matter that there is an unsafe escape hatch, because it’s literally impossible to write low-level kernel stuff or FFI to other languages without unsafe: Rust can’t uphold or guarantee the invariants of the hardware itself, or the behavior of other languages.
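To illustrate why FFI is inherently unsafe: declaring and calling a C function (here, libc’s `abs`, using the pre-2024-edition `extern` block syntax) requires an `unsafe` block, because Rust takes the declared signature and the function’s behavior entirely on faith. There is no style choice that removes it.

```rust
extern "C" {
    // From the C standard library; Rust cannot verify this signature
    // or anything about what the function does.
    fn abs(input: i32) -> i32;
}

fn main() {
    // SAFETY: abs is well-defined for every i32 except i32::MIN; -3 is fine.
    let v = unsafe { abs(-3) };
    assert_eq!(v, 3);
}
```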
I’ll admit I haven’t looked at the code. I would stand by my comment of the unsafe block being a starting point.
Countering that, however: what is the difference from just debugging effectively? Not sure. I suppose it’s down to the people who identified and fixed it, at the end of the day, to say whether there was any benefit.
No. The issue is that an assumption they make in the unsafe block does not actually always hold true. They changed the safe Rust code to strengthen the assumption they made in the first place, because that is far easier than rearchitecting the unsafe part. I.e. if the unsafe part were somehow written safely, the mitigation they introduced now would not result in any difference in behaviour; it would be correct behaviour both before and after.
TL;DR: the problem lies in the unsafe part.
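A toy model of the failure mode being described, with entirely made-up names (this is not the kernel code in question): the unsafe block assumes an invariant (`pos < buf.len()`) that only the surrounding safe code upholds. Tightening a safe method so the invariant can never be violated fixes the vulnerability without touching the unsafe block, which is exactly why a correct unsafe implementation would behave identically before and after the mitigation.

```rust
struct Cursor {
    buf: Vec<u8>,
    pos: usize, // invariant: pos < buf.len()
}

impl Cursor {
    fn new(buf: Vec<u8>) -> Option<Self> {
        if buf.is_empty() {
            None
        } else {
            Some(Cursor { buf, pos: 0 })
        }
    }

    // The "mitigation in safe code": clamp instead of trusting the caller,
    // strengthening the invariant the unsafe block depends on. Without the
    // clamp, a caller could set pos out of range and turn current() into
    // an out-of-bounds read.
    fn seek(&mut self, pos: usize) {
        self.pos = pos.min(self.buf.len() - 1);
    }

    fn current(&self) -> u8 {
        // SAFETY: relies entirely on pos < buf.len(), an invariant that
        // every safe method of this type must maintain.
        unsafe { *self.buf.get_unchecked(self.pos) }
    }
}

fn main() {
    let mut c = Cursor::new(vec![1, 2, 3]).unwrap();
    c.seek(99); // clamped to the last valid index
    assert_eq!(c.current(), 3);
}
```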