Hot take: your desire to run your choice of OS on your work laptop does not trump my desire to ensure the safety of data belonging to our users
Making sure that code is developed on and managed from machines where we have reasonable confidence about the security posture is Good, Actually
@mjg59 But people in charge of security are never held responsible for the delays security introduces, which leads to people routing around it to hit schedules.
Enforcing security with threats of dire consequences puts people in an unwinnable situation -- either get fired for security violations or get fired for failing to meet deliverables.
@keithp yeah "Security says no" is a toxic (and common) scenario, and all restrictions need to exist for transparent and justifiable reasons, and shouldn't be imposed without discussion with everyone affected. But the flip side of that is that time taken to work with security in finding solutions needs to be factored into project planning - if company culture doesn't allow this, the culture is broken
@keithp (and I prefer solutions where it's simply impossible to do the insecure thing, so users don't have to worry about violating policies)
@mjg59 But security isn't ever accountable for product schedules, so "time to work with security" is an unbounded entry on any product schedule. Far easier to just "encourage" people to route around the blockage. There's always a way to do things insecurely that will be faster than the right way.
@keithp and development is rarely accountable for the risks associated with compromise of user data, even in situations where that potentially leads to actual dead people and existential threat for the company. If developers are encouraged to work around security then company culture is fucked and needs fixing
This seems like a scenario where the very common middle ground is incredibly awful.
"IT is not competent enough to enforce security restrictions" has serious problems, but people can get work done in that environment.
"IT is competent enough to enforce security restrictions and both willing and able to take responsibility for making them reasonable and dealing with any issues that arise" has a different set of potential problems, but can work.
"IT is competent enough to enforce security restrictions but is not willing or able to take responsibility for the consequences of their restrictions" creates a horrible environment that developers should run away from very fast.

People in scenario 1 are in a local maximum, and aren't confident (and often have ample evidence to the contrary) that they'll get to scenario 2 rather than 3, so they fight both 2 and 3 tooth-and-nail. And it seems like very few companies reach 2, far more end up at 3, and the ones in 3 think they're in 2 and blame the developers for not accepting it.

@josh @keithp @mjg59 What kind of users one has is incredibly important, IMO.

One can impose a vast array of restrictions on non-technical employees without significantly interfering with their productivity. Developers, however, cannot do their job without privileged access to their development environments. Applying policies meant for non-technical users to developers is a recipe for disaster.

In my experience, the only reasonable solution is to isolate sensitive data (such as customer data and signing keys) and integrity-critical data (such as source code) from dangerous workloads (such as email and web browsing) via virtualization. The security boundary becomes the entire VM, not the unsecurable workload running within it. Qubes OS makes this much easier.
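The compartmentalization described above can be sketched with Qubes' dom0 CLI tools. This is a minimal illustration, not a recommended production setup; the qube names, template, and labels are hypothetical, and the exact template name will depend on what's installed.

```shell
# Run in dom0 on Qubes 4.x (qube names and template are illustrative)

# A qube for dangerous workloads: email, web browsing.
# If it's compromised, the blast radius is just this VM.
qvm-create --class AppVM --template fedora-40 --label red untrusted

# A separate qube holding integrity-critical material: source code, keys.
qvm-create --class AppVM --template fedora-40 --label black vault-dev

# Detach the sensitive qube from the network entirely, so a compromised
# 'untrusted' qube has no network path to it.
qvm-prefs vault-dev netvm ''
```

The security boundary is then the VM, as described above: nothing running inside `untrusted` can reach the data in `vault-dev`, regardless of how insecure the workload in `untrusted` is.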

@alwayscurious @josh @keithp Based on experience, Qubes is too much of a productivity impediment for even security engineers
@mjg59 @josh @keithp Why is that? Personally I consider this a bug, and I imagine the rest of the Qubes devs would also rather this not be the case.
@alwayscurious @josh @keithp this was an exercise that was carried out with the devs - I don't have access to the writeup any more, sadly
@mjg59 @josh @keithp how long ago was it? If you mean the Qubes devs, then hopefully some changes have been made based on that.
@mjg59 I agree, but I also find it telling that organizations like Let’s Encrypt and Mullvad use Qubes OS, despite its lack of remote attestation.
@mjg59 what if my efficiency drops a lot with any OS the company (or you, in this case) offers?
Shouldn't that at least be a reason for you to make something reasonably safe work, rather than just "take this OS and none other"?
@littlefox if your employer can't support an OS that allows you to work effectively then it's time to find another employer, just as in any other case where employer policies interfere with your ability to do work
@littlefox if the default is just to allow anything then you end up with someone insisting on Windows XP and then their prod creds getting stolen and now you're going to have a bad day
@littlefox And if instead you want IT to support managing an additional range of special-cased OSes then you may improve some developer efficiency but at a significant cost to IT and security efficiency
@mjg59 @littlefox also, I don't know all the tooling out there, but when I was doing infra at my last job I saw what the tools IT was using on Mac and Windows offered on Linux and I can say with full certainty that allowing me to use Linux was allowing me to get away with a vastly inferior level of mandated security. There simply isn't the security ecosystem out there that there is for more common OSes.

@mjg59 it's a hard trade-off between cost and developer efficiency

I get the need to restrict developer (or any user) choice, but I also really get the need for tools you can work with quickly, and the frustration of not being able to use them

It's hard

Had such a discussion with our CEO some weeks ago, but we didn't find a way to easily resolve this

@littlefox I'm enthusiastic about the goal being to find a way that lets developers work effectively without compromising on security (there's no point in security if nobody can do their job!), but the solution can't be for developers to assert they have the right to choose an arbitrary OS
It doesn't help that the mechanisms used to enforce "don't run some obscure Linux/BSD/etc distro and fail to install security patches" or "don't access production user data without massive restrictions" often also effectively enforce one or more of:
- "don't run newer software (that you need to get your job done)"
- "don't have remote access to systems that IT isn't allowed to have remote access to or any control over" (e.g. community/project servers)
- "don't have private keys IT shouldn't have remote access to"
- "don't expect to build quickly or have your tests reliably pass or get sensible results out of performance analysis, because you've got mandatory scanners on your system"
- "don't have experimental test systems" (yes, there *are* IT departments that think new-board bringup systems should have virus scanners on them)
- "don't expect to change system configuration because you don't have root".

Restrictions to supported/secure/up-to-date distributions are not inherently bad in isolation. And frankly, systems where you can access *production user data* should absolutely be locked down incredibly tight and should not prioritize developer comfort or privacy. But once you go that route, quite a lot of what becomes *possible* to enforce on individual developer workstations can then cause major problems.

There are *better* solutions for all of those things, but that requires IT to recognize all of those things as problems and be required to provide better solutions.
@mjg59 @littlefox Which is usually the right answer, though. Surely the workflow should be "work out what OS your developers need in order to be effective, then hire IT people capable of supporting that OS", and not "work out what OS your existing IT people are capable of supporting, then tell your developers to use it"?
@TalesFromTheArmchair @littlefox If I have 15 different developers wanting to run 15 different BSD variants, the answer isn't to hire enough IT staff to manage 15 different BSD variants
@TalesFromTheArmchair @mjg59 but it seems like especially when you have a wide variety of jobs (datacenter people, network engineering, ops for different infra parts and toolchains, developers, again for many different toolchains and ecosystems), you'll end up with a wide variety of OSes, toolsets, and the like favored by your people
@mjg59 @littlefox @TalesFromTheArmchair it’s got to be a bit of both. Any engineer who insists it’s my way (tools) or the highway is probably ignoring other rules as well.
@mjg59 @littlefox I know you're referring to software development settings. Other workplaces present different challenges. For me (working in healthcare and primary education settings, with a distant background in corporate infosec), it's common to feel that I'm better able to protect sensitive data than my employer, whose IT systems are often overseen by an overworked operations manager who's struggling to keep the lights and HVAC running, as well as manage a coterie of aging servers and desktops. But except when invited, I rarely deviate from the work systems I'm given, since I don't want the liability I'd assume in going my own way, even if I'd be more efficient (and even if my employer wouldn't care). The employer "owns" their policies and their consequences.

@littlefox @mjg59 Efficiency is, usually, the employer's concern. If my employer is happy to pay me for 10 hours of business-value-delivering work and 30 hours of fighting with OS config, why should I be unhappy to accept?

(If it gets in the way of personal growth and future career prospects, then sure. But very many things can get in the way of that too.)

@geofft @mjg59 because it's extremely frustrating to want to work on actual stuff and not be able to do it as efficiently as I'm used to
@mjg59 it's easier said than done for special startups; the number of times I've had to support rando Ubuntu machines at an MSP would make you shake your head..
@mjg59 good luck with this fight. You have my sympathies.
@mjg59 hot take: put the user data back on the users' own storage, return to monke, make good old actual software instead of services
@valpackett not sure that asking users to take responsibility for the security of all their sensitive material is a net improvement
@mjg59 but it's so great from the "blast radius" point of view isn't it? breaking in to a client device is an attack against one person, breaking in to a server is an attack against millions at once, poof, everyone is fucked in an instant – that's way too scary!
@valpackett but it's not breaking into one client device, because nobody wants email that's only accessible from one device. And nobody wants to maintain a server themselves, so it's some sort of appliance. And if there's a vuln in that appliance, everyone using that appliance gets popped. So, do you trust the developers of that appliance to be better at security?

@valpackett @mjg59
That line of reasoning sounds close to "Don't build safety systems into cars or the driving environment; let each driver handle their own safety."

We have centralized safety and security concerns in many domains because we did let people handle it themselves at some point, resulting in disasters.

@mjg59 Completely agree but also unclear why/in what context anyone's work laptop would have access to data belonging to your users.
@dalias @mjg59 given the use of 'our' I suspect this may be calling out a coworker trying to bypass security controls to boot an unauthorized OS on corporate hardware, not a general call-out.

@mjg59 Sure.

And then there are the special cases. Due to something like that, I recently told my employer that I won’t be able to do the work he gives me any longer. That is going to be fun, and a pissed-off customer.

@mjg59 No problem, I'll just work somewhere else. (I'm not kidding, it's why I fired my last employer)

Security people always say they don't want to be the ones who always say no, ...and then they do. (Really, how about a PR review that just screens for the most egregious issues? Raising the bar to some modest subtlety seems quite cost-effective)

If my choice of OS in any way risks customer data, it's already long past time to run for the hills, because security fell down on the job a long, long time ago.

@mjg59 while I agree, there are far too many places for which this means "use Windows or find another job"

@mav @mjg59

It seems like if I, as a dev, have access to our customers' data, then you have already failed.

I should be working off test data, and only have access to spinning up test environments where me running whatever the hell I want can't impact anyone.

@lordbowlich @mjg59 This is also a very valid point, the number of people who have direct access to live customer data really should be extremely minimal