Everything in modern computing seems driven by performance graphs, for software (and firmware) that is full of security vulns, the theory being that this is okay because mitigations can be applied later, before (too many) users are permanently harmed. Ideally with minimal fixes that address each individual bug as it is found, as narrowly as possible, so the benchmarks don't move, maintaining maximum performance (and maximum vulnerability).

Your computer is designed for harm and performance, not your safety, at this time.

You may think, hey, that's unfair, it's not designed for harm. But choices are made every day not to make the system safer, so it is a design choice.

Would you be ok with a bridge that was designed to fall apart slowly, with a plan to continually patch it after anything broke, because this cost your govt less money? And then parts of the bridge roadway would fall off at times, maybe while people were driving on it. They would quickly repair those within a few days with a “fast patch”, and there would be articles praising them for acting quickly to protect drivers from the holes in the bridge. But would that not be designing for harm? Because there are other choices that would avoid it, which we see in actual bridge building. But that is how computers currently look from inside security teams.

https://www.cisa.gov/cisa-director-easterly-remarks-carnegie-mellon-university

@blinkygal

There's a 1992 paper/keynote by Nancy Leveson [1] that I think about sometimes, in which she compares software development to steam engines. In the 19th century, people built high-pressure steam engines despite lacking the metallurgical, engineering, and manufacturing-process knowledge to make boilers that don't occasionally explode, killing people. It took almost a century to really solve the problem.

She was arguing that software and steam engines share similar relationships between economic usefulness, technological limitations, and safety, and she supposed that software might follow a similar trajectory as we improve engineering practices. For boilers, regulation was a part of it.

I think the comparison still holds up, over 30 years later.
[1] http://sunnyday.mit.edu/steam.pdf

@kenrb Oh wow this could literally be about software.

- Complaining that “the risk is being exaggerated”.
- “The intense discussion of defects and safety risks has clouded the issue of its advantages and has ‘disgusted the industrial community’”
- The lack of standardized training, even within a company/field, to work on a specific safety-critical product.
- Expecting users to use the software correctly to not have it explode on them (opt-in hardened modes).
- Trying to solve it by having people inspect them without changing their design/construction. Just Do It Better.
- Large and growing number of affected people and property loss.

It’s remarkable this was the first US regulation of private industry. I hope software will move on to the lessons-learnt part soon, rather than sticking to the things-going-wrong part.

@blinkygal @kenrb a big difference is steam engines and tunnels don't have agents from hostile nations bombing them daily. We don't blame the engineers for the work of the sappers.

I'm wary of the ecosystem consequences of more regulation around software security, especially for smaller firms. Can regulation even keep up with adversaries? Lots of big questions here.

@nsa @kenrb I don’t know that it has to be so different with regard to the nature of the problem.

It is certainly different in that intelligent adversaries reduce the effectiveness of some prevention or containment approaches. For example, a sandbox contains software vulns, but in the presence of an intelligent adversary, chaining multiple failures together is a deliberate act (given they know how to exploit them), rather than something where probability or causal relationships play a large role.

Leveson’s paper presents regulation as a poor solution, and I don’t disagree, but the better solution is industry solving these problems by putting safety first, and 21 years after that paper was published, things look worse rather than better. So I hope for regulation anyway, because it has a chance to improve things, and that is a role of govt: to force incentives that do not arise naturally in business.

The same pattern of industry chasing profits/performance and pushing the cost and blame of faults onto users occurred in the auto industry too, and regulation saved lives there just as it did with steam engines. Jen Easterly’s talk (linked above) draws a great comparison there.

@nsa @blinkygal @kenrb you’re scared that businesses profiting from harm might… what, be less profitable when forced to be less harmful?