Hot take: your desire to run your choice of OS on your work laptop does not trump my desire to ensure the safety of data belonging to our users
Making sure that code is developed on and managed from machines where we have reasonable confidence in the security posture is Good, Actually
@mjg59 But the people in charge of security are never held responsible for the delays security introduces, which leads people to route around it to hit schedules.
Enforcing security with threats of dire consequences puts people in an unwinnable situation -- either get fired for security violations or get fired for failing to meet deliverables.
@keithp yeah, "Security says no" is a toxic (and common) scenario. All restrictions need to exist for transparent and justifiable reasons, and shouldn't be imposed without discussion with everyone affected. But the flip side is that the time taken to work with security on finding solutions needs to be factored into project planning - if company culture doesn't allow this, the culture is broken
@mjg59 But security isn't ever accountable for product schedules, so "time to work with security" is an unbounded entry on any product schedule. Far easier to just "encourage" people to route around the blockage. There's always a way to do things insecurely that will be faster than the right way.
@keithp and development is rarely accountable for the risks associated with compromise of user data, even in situations where that potentially leads to actual dead people and existential threat for the company. If developers are encouraged to work around security then company culture is fucked and needs fixing
This seems like a scenario where the very common middle ground is incredibly awful.
1. "IT is not competent enough to enforce security restrictions" has serious problems, but people can get work done in that environment.
2. "IT is competent enough to enforce security restrictions and both willing and able to take responsibility for making them reasonable and dealing with any issues that arise" has a different set of potential problems, but can work.
3. "IT is competent enough to enforce security restrictions but is not willing or able to take responsibility for the consequences of their restrictions" creates a horrible environment that developers should run away from very fast.

People in scenario 1 are in a local maximum, and aren't confident (and often have ample evidence to the contrary) that they'll get to scenario 2 rather than 3, so they fight both 2 and 3 tooth and nail. And it seems like very few companies reach 2; far more end up at 3, and the ones in 3 think they're in 2 and blame the developers for not accepting it.

@josh @keithp @mjg59 What kind of users one has is incredibly important, IMO.

One can impose a vast array of restrictions on non-technical employees without significantly interfering with their productivity. Developers, however, cannot do their job without privileged access to their development environments. Applying policies meant for non-technical users to developers is a recipe for disaster.

In my experience, the only reasonable solution is to isolate sensitive data (such as customer data and signing keys) and integrity-critical data (such as source code) from dangerous workloads (such as email and web browsing) via virtualization. The security boundary becomes the entire VM, not the unsecurable workload running within it. Qubes OS makes this much easier.
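To make the split concrete, here's a minimal sketch of that compartmentalization using Qubes' `qvm-*` tools from dom0 (the template and qube names here are illustrative assumptions, not a recommended setup):

```shell
# Sketch: one qube per trust domain (run in a Qubes OS dom0 shell).
# Template "fedora-39" and the qube names are assumptions for illustration.

# A qube for dangerous workloads: web browsing, email.
qvm-create --class AppVM --template fedora-39 --label red untrusted

# A qube for development, holding source code and privileged tooling.
qvm-create --class AppVM --template fedora-39 --label blue dev

# An offline vault qube for signing keys and other sensitive data:
# clearing netvm leaves it with no network access at all.
qvm-create --class AppVM --template fedora-39 --label black vault
qvm-prefs vault netvm ""
```

A browser exploit in `untrusted` then compromises only that qube; it can't read the source tree in `dev` or touch keys in `vault` without crossing a VM boundary, and developers keep full root inside their own `dev` qube.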

@alwayscurious @josh @keithp Based on experience, Qubes is too much of a productivity impediment even for security engineers
@mjg59 @josh @keithp Why is that? Personally, I consider this a bug, and I imagine the rest of the Qubes devs would also rather it not be the case.
@alwayscurious @josh @keithp this was an exercise that was carried out with the devs - I don't have access to the writeup any more, sadly
@mjg59 @josh @keithp how long ago was it? If you mean the Qubes devs, then hopefully some changes have been made based on that.