Security issues can broadly be boiled down into two categories:
- Seriously mind-blowing 0-days no one even considered, the kind that shock and amaze you with the hacker's thinking
- Developers who took shortcuts to meet some kind of deadline
Btw, that second category isn't really the fault of developers; it's more the external pressure on devs from project stakeholders who put shipping functionality above considering security impacts.
A dev vowing to shift left does nothing on its own when project managers just want to see features in the app. You need a fundamental shift in your development culture, one that enables and rewards developers taking longer to ship something, so that what they ship is built with security in mind.
@insiderphd That was the core of my BSides London talk a few years back. Effective shifting left is about working more closely with Testers and Product Managers (or whatever your org chooses to call them), not lumping ‘and security’ onto the already full plate of a full stack dev.

@insiderphd Found it!

How we did successful shifting left at scale, by ignoring all the standard advice and methodologies

Featuring ....

- Me presenting somebody else's slides after only seeing the talk once, the day before, on the train.

- Me waking everyone up by punching the mic with my flappy hands at an unexpected point.

- Far too many Top Gear references (see "Not My Slides")

https://www.youtube.com/watch?v=8ERGlHijaKc

Pushing Left - How We're All Doing It Wrong by Glenn Pegden & Stephan Steglich

Good points, @insiderphd. Thank you for bringing this up. If there is no incentive—or liability—very few businesses would create such a culture out of morality or a sense of responsibility, unfortunately. How can we help businesses do it? Other than governments imposing regulations, or insurance agencies refusing coverage for not having at least some sort of basic secure development lifecycle program in place?
@khalid This is a great question that I think not enough people are asking. We know what bad security looks like, but how do we empower good security? Let me know if you solve it ;)

@insiderphd @khalid

I have a 100% effective security measure, but I can't share it because it would no longer be 100% effective.

@Pjcoyle @insiderphd Patrick, hmmm.. that's not going to be sustainable. Not knowing the details, I don't want to comment much, but in general security-by-obscurity (which your measure may not be) is not a good strategy.
@khalid @insiderphd Okay, I'll describe my security. The computer is disconnected from everything, including keyboard and screen. All of the ports are sealed with resin. It is stored in a welded Faraday cage and encased in concrete. Finally, it is buried at a secret location. Even I can't access the information. (SIGH)

@Pjcoyle @insiderphd
😂​ Good one Patrick! Was a little skeptical but was secretly hoping you did have some sort of innovative solution.

Seriously though, it does not have to be this extreme--nor this perfect. Products with security designed in will make a big enough difference in the longer term. Deliberate attention to security during design and development is the way. However, we are back to my original question: what would make product manufacturers and software suppliers take this on?

@khalid @insiderphd The full application of product liability laws, starting by disallowing EULAs.
@insiderphd IMO it goes back to the “if it ain’t broke, don’t fix it” culture where the business owns the backlog and signs off on what goes in. It’s about the false equivalence where the business wants the dev team to spend every hour of work only on high-value items (“new features”) and completely ignore low-value items (“patch upgrades to runtimes”).