Everything in modern computing seems driven by performance graphs, for software (and firmware) that is full of security vulns; the theory being that this is okay because mitigations can be applied later, before (too many) users are permanently harmed. Ideally with minimal fixes that address each individual bug as it is found, as narrowly as possible, thus not moving the benchmarks: maintaining maximum performance and maximum vulnerability.

Your computer is designed for harm and performance, not your safety, at this time.

You may think: hey, that's unfair, it is not designed for harm. But choices are made every day to not make the system safer, so it is a design choice.

Would you be okay with a bridge that was designed to fall apart slowly, with a plan to continually patch it after anything broke, because this cost your govt less money? And then parts of the bridge roadway would fall off at times, maybe while people were driving on it. They would quickly repair those within a few days with a “fast patch”, and there would be articles praising them for acting quickly to protect drivers from the holes in the bridge. But would that not be designing for harm? Because there are other choices that would avoid it, which we see in actual bridge building. That is how computers currently look from inside security teams.

https://www.cisa.gov/cisa-director-easterly-remarks-carnegie-mellon-university

@blinkygal

There's a 1992 paper/keynote by Nancy Leveson [1] that I think about sometimes, in which she compares software development to steam engines. In the 19th century, people built high-pressure steam engines despite lacking the metallurgical, engineering, and manufacturing-process knowledge needed to make boilers that don't occasionally explode, killing people. It took almost a century to really solve the problem.

She was arguing that software and steam engines share similar relationships between economic usefulness, technological limitations, and safety, and she supposed that software might follow a similar trajectory as we improve engineering practices. For boilers, regulation was a part of it.

I think the comparison still holds up, over 30 years later.
[1] http://sunnyday.mit.edu/steam.pdf

@kenrb Oh wow this could literally be about software.

- Complaining that “the risk is being exaggerated”.
- “The intense discussion of defects and safety risks has clouded the issue of its advantages and has ‘disgusted the industrial community’”
- The lack of standardized training, even within a company/field to work on specific safety-critical product.
- Expecting users to use the software correctly to not have it explode on them (opt-in hardened modes).
- Trying to solve it by having people inspect them without changing their design/construction. Just Do It Better.
- Large and growing number of affected people and property loss.

It’s remarkable this was the first US regulation of private industry. I hope software moves on to the lessons-learnt part soon, rather than sticking with the things-going-wrong part.

@blinkygal @kenrb a big difference is steam engines and tunnels don't have agents from hostile nations bombing them daily. We don't blame the engineers for the work of the sappers.

I'm scared of the ecosystem consequences of more regulation around software security, especially for smaller firms. Can regulation even keep up with adversaries? Lots of big questions here.

@nsa @kenrb I don’t know that it has to be so different with regard to the nature of the problem.

It is certainly different in that intelligent adversaries reduce the effectiveness of some prevention or containment approaches. For example, a sandbox contains software vulns, but in the presence of an intelligent adversary the chaining of multiple failures together is a deliberate act (given they know how to use them), rather than something where probability or causal relationships play a large role.

Regulation is presented as a bad solution in Leveson’s paper, and I don’t disagree; the better solution is industry solving these problems by putting safety first. But 31 years after that paper was published, things look worse rather than better. So I hope for regulation anyway, because it has a chance to improve things, and that is a role of govt: to force incentives that do not arise naturally in business.

The same patterns of industry chasing profits/performance and pushing the cost and blame of faults onto users occurred in the auto industry too and regulation saved lives there just as with steam engines. Jen Easterly’s talk (linked above) does a great comparison there.

@nsa @blinkygal @kenrb you’re scared that businesses profiting from harm might… what, be less profitable when forced to be less harmful?
@blinkygal @kenrb She is also the author of Engineering A Safer World, which is a must-read https://direct.mit.edu/books/oa-monograph/2908/Engineering-a-Safer-WorldSystems-Thinking-Applied

@blinkygal @kenrb
But regulation of software comes with its own set of problems, particularly regarding open source. Check this by @bert_hubert on the EU Cyber Resilience Act:

https://berthub.eu/articles/posts/eu-cra-practicalities/

@kenrb @blinkygal the thing is, unlike those boiler folks, we know quite well how to do this. However, it takes time and effort (and thus, money), so it's just not happening (enough) because incentives are elsewhere.

@vriesk @kenrb Yes! So it is a design choice being made, now, rather than unknown unknowns.

We have learnt a lot in the 31 years since the comparative paper on steam engines was published, and attackers make use of new knowledge. Software vendors keep trying to find silver bullets to avoid changing their designs.

Jen Easterly has this right I think in her two-decade-later followup. Incentives have to change.

@blinkygal @kenrb design choice, yes, but as in "process design choice": there are many aspects involved. Software design is one of them, but there's also a whole richness of technology choices, and then the choice of due processes as well.

And then there's the problem that the whole software engineering world depends on a huge amount of legacy software that really deserves to be rewritten/hardened. And who's gonna do that?

@vriesk @kenrb Thanks for the thoughtful comment, I've been mulling over what I think here.

Yeah, it is a process design choice, with a ton of inertia toward using the tools that existing software was built with, which then propagates the same security properties into new code as well.

And you're absolutely right, there's a whole world of legacy software, most of the large software we use day-to-day falls into that category. Some small things are being rewritten (https://www.memorysafety.org/). But there's vast amounts of code that is not.

Part of me says it doesn't matter for the purpose of the above; users are harmed, thousands of people lose control of their digital lives every year, some end up in jail or otherwise at risk for it, and millions or billions are lost in ransoms. The point in some sense is just that this is happening, that vendors are aware of it, and that users don't really know what's being given to them, or that it was a choice.

Right now there's not the right level of acknowledgement of the problem, in my opinion, outside of CISA and the intelligence agencies. So of course nothing is moving to fix it in a hurry.

If the vendors who are writing on top of legacy software started investing into rewriting it into memory safe languages, or hardening it with tools within the same language, things would start to get better, maybe fast enough that regulation and liability laws wouldn't be needed to protect people.


@blinkygal @kenrb I agree, but I would like to stress that memory safety is just one aspect of general software safety/security; on the strictly technical level, there's also concurrency safety (both at the scale of threads and of distributed systems), the whole aspect of proper authentication and authorization management, and then the more general aspect of logic safety: making sure the tools used are used properly.

1/2

@blinkygal @kenrb As in, no memory- and thread-safe software stack will protect you from a logic flaw that amounts to "if a person wearing a pink hat kindly asks for money, give them all we have".

Those types of issues can't be ruled out, but proper development process design can make them less likely. This also costs time and effort, naturally.

2/2
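The pink-hat point can be sketched concretely. Below is a hypothetical Rust fragment (the function name and parameters are invented for illustration): the compiler guarantees memory and thread safety throughout, yet it cheerfully compiles the business-logic flaw.

```rust
// Hypothetical payout routine: memory- and thread-safe by construction,
// yet containing exactly the kind of logic flaw no type system catches.
fn approve_payout(balance: u64, requested: u64, wearing_pink_hat: bool, asked_kindly: bool) -> u64 {
    if requested <= balance {
        requested
    } else if wearing_pink_hat && asked_kindly {
        // Logic flaw: "give them all we have". Compiles cleanly.
        balance
    } else {
        0
    }
}

fn main() {
    // Normal request: fine.
    assert_eq!(approve_payout(1_000, 50, false, false), 50);
    // A kind request in a pink hat drains the account.
    assert_eq!(approve_payout(1_000, 9_999, true, true), 1_000);
    // Without the hat, the over-budget request is refused.
    assert_eq!(approve_payout(1_000, 9_999, true, false), 0);
}
```

The borrow checker constrains how data is accessed, not what the program decides to do with it; catching the flawed branch takes review, testing, and process.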

@blinkygal @kenrb adding to that, there are technical choices to be made that either help or hamper writing correct (business-logic-wise) software, which is one of the reasons I consider type-unsafe languages like Python or JavaScript, and languages with poor expressivity like Go, to be poor choices for general-purpose programming.
@vriesk @kenrb While that is true, the vast majority of security flaws I see in my work, and the most useful ones to attackers, are memory safety bugs. Without tackling those there’s little hope for system integrity.
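For concreteness, a minimal hypothetical sketch of that bug class: in C, forgetting the bounds check below silently corrupts adjacent memory and hands an attacker a write primitive; in safe Rust the same mistake is forced into a visible error path (or a deterministic panic with plain indexing).

```rust
// Hypothetical sketch of the out-of-bounds write behind many memory-safety
// CVEs. `get_mut` makes the bounds check unskippable; there is no path to
// silent corruption here.
fn write_at(buf: &mut [u8], index: usize, value: u8) -> Result<(), String> {
    let len = buf.len(); // taken before the mutable borrow below
    match buf.get_mut(index) {
        Some(slot) => {
            *slot = value;
            Ok(())
        }
        None => Err(format!("index {index} out of bounds for len {len}")),
    }
}

fn main() {
    let mut buf = [0u8; 4];
    // In-bounds write succeeds.
    assert!(write_at(&mut buf, 2, 0xFF).is_ok());
    assert_eq!(buf[2], 0xFF);
    // The attacker-controlled index that would corrupt memory in C
    // becomes a recoverable error instead.
    assert!(write_at(&mut buf, 1000, 0x41).is_err());
}
```

The claim in the post is about prevalence, not sophistication: this one pattern, with a length check omitted, accounts for a large share of exploitable flaws in legacy C and C++ codebases.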
@blinkygal I live in the US. That IS how bridges are built and maintained.
@blinkygal That's all true, but it's more a function of every org constantly focusing on maximising returns.

@joshoconnorchen Yes indeed, and there's an implication in there about what incentives they are given to maximize within. Business makes extremely poor decisions if left to optimize without incentives that reflect the wellbeing of people. I think we see this in a lot of domains right now, and it will require outside influence to course-correct.

I use “Business” here in the meaning where Business exists for extracting profits and Industry exists for production and meeting society’s needs. I can’t find the excellent writeup that led me to this distinction to cite right now, maybe once full text search drops.

@blinkygal While I wholeheartedly agree, it's useful to remember that modern computing is still like life that just crawled on land for the first time. Most technology is about who was able to grow a leg first, not who is actually ideal for the task at hand. Or leg.