so
if github had an outage
and was offline for 48 hours
what would catch fire?
cuz you know
im just curious
how many companies are you aware of that had "oh fuck someone actually blew up the datacenter" in their threat model or DR plans?
how many are like "whatevs, our shit is in the cloud and all the contract language has us well protected against lawsuits if shit goes down"?
how many do you think realize that attitude towards technology solves for "the lawsuits after the fact" but does absolutely zero for business continuity?
@Viss I tried to drill into my last group that Disaster Recovery means you have a USB drive with an OS ISO and brand-new servers with nothing on them, in a location that's no longer your lab.
And the local wiki on another drive, along with any security certificates, and the build/release system has to come back from scratch with just that.
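A minimal sketch of what checking that kind of kit could look like, assuming the drive carries a "manifest.txt" of sha256 sums; the mount point, file names, and manifest format are all made up for illustration:

```python
#!/usr/bin/env python3
"""Sanity-check an offline DR kit against a checksum manifest.

Hypothetical layout: the kit drive holds an OS ISO, a wiki dump,
certs, and a manifest.txt of "<sha256>  <relative path>" lines.
"""
import hashlib
import sys
from pathlib import Path

KIT = Path("/mnt/dr-kit")          # assumed mount point for the kit drive
MANIFEST = KIT / "manifest.txt"    # assumed manifest of known-good checksums

def sha256(path: Path) -> str:
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def main() -> int:
    failures = 0
    for line in MANIFEST.read_text().splitlines():
        if not line.strip():
            continue
        expected, rel = line.split(maxsplit=1)
        target = KIT / rel
        if not target.exists():
            print(f"MISSING  {rel}")
            failures += 1
        elif sha256(target) != expected:
            print(f"CORRUPT  {rel}")
            failures += 1
    print("kit OK" if failures == 0 else f"{failures} problem(s)")
    return 1 if failures else 0

if __name__ == "__main__":
    sys.exit(main())
```

If that check only passes while GitHub and your package mirrors are up, it isn't a DR kit.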
@Viss Back in the day, I worked somewhere that had a data center approach for almost all the services that could deal with one of the sites abruptly vanishing if it came to it. And one of the biggest problems we had was with software vendors that couldn’t seem to conceive of this and struggled to implement measures we needed to support that.
The company wrote most of its own software, but boy, did we have to have some arguments about the stuff we didn’t write. Had one vendor - who we did not select for purchase after this behaviour! - very snottily tell us ‘you don’t need that’ when we asked about a feature that supported some data redundancy we DID very much need.
@Viss my favourite dr scenario included that the primary dc was under a flight path. and *all* of IT was within yards of that dc.
the COO could not wrap his head around why my planning included non-IT-folks levels of documentation in case of, you know, plane+dc interaction.
still they went cloudy since then so it's all safe from harm and outages now /s
@Viss there are also the chaps with access to a US military base who stole four drones from a warehouse. Caught on camera. They’ve disappeared. FBI says “nothing to worry about”.
I’m normally in the “false flag is a conspiracy theory” camp but with these psychopaths in charge these days, at this point anything short of the U.S. launching an ICBM at Sacramento (because Newsom hurt his feelings) has a modicum of plausibility.
Has Amazon invested in counter drone tech or point defence? 🤔🤷♂️
I wonder if they will in the not too distant future.
nothing i care about? :)
home, none. phone carrier, not much if my basic internet is working. ISP? possible though so far they seem to be doing their revision control mostly in house still. my power company only recently started offering electronic statements, so i'm probably safe there for the moment. ;)
work/customer related shit? probably a much uglier picture.
i've had similar discussions more times than i'd care to about server OS maintenance. the number of under-40s who don't understand why spinning fresh virtual servers from the screaming-newest, externally maintained, updated-multiple-times-daily repos is a bad way to repro a system-related issue: when the baseline keeps shifting, you can't tell whether it's your code or the OS that's screwing things up.
i have sometimes won the argument for at least doing everything from local repos and not updating those repos without regression testing, but not nearly as often as i'd like, or as often as that turned out to be the correct answer.
automation that depends on someone else's stuff working, staying reachable, and staying revertible to earlier versions is bound to be a problem at some point.
hell, the idea of regression testing at all and not just wiping and leaping forward on new versions seems to be a dying concept.
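a rough sketch of the "pin a baseline, gate updates on a regression run" idea, assuming a Debian-ish box; the pinned.txt name and the run-regression-tests.sh command are hypothetical, not anything from this thread:

```python
#!/usr/bin/env python3
"""Detect OS package drift against a pinned baseline (sketch)."""
import subprocess
import sys
from pathlib import Path

PINNED = Path("pinned.txt")  # known-good "<package> <version>" lines

def installed() -> dict:
    # dpkg-query lists every installed package with its version
    out = subprocess.run(
        ["dpkg-query", "-W", "-f=${Package} ${Version}\n"],
        capture_output=True, text=True, check=True,
    ).stdout
    return dict(line.split(maxsplit=1) for line in out.splitlines() if line)

def main() -> int:
    now = installed()
    if PINNED.exists():
        pinned = dict(l.split(maxsplit=1)
                      for l in PINNED.read_text().splitlines() if l)
        drift = {p: (v, now.get(p)) for p, v in pinned.items()
                 if now.get(p) != v}
        if drift:
            for pkg, (was, cur) in sorted(drift.items()):
                print(f"DRIFT  {pkg}: {was} -> {cur}")
            print("baseline changed; rerun regression before trusting repros")
            return 1
        print("matches pinned baseline")
        return 0
    # no baseline yet: only record one after the regression suite passes
    if subprocess.run(["./run-regression-tests.sh"]).returncode == 0:
        PINNED.write_text("".join(f"{p} {v}\n"
                                  for p, v in sorted(now.items())))
        print("regression passed; baseline recorded")
        return 0
    print("regression failed; not recording baseline")
    return 1

if __name__ == "__main__":
    sys.exit(main())
```

nothing fancy, just enough to notice the baseline drifted before you blame your own code.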
self hosting and offline would certainly be my personal recommendation but not all companies or clients are willing to go that route.