I've read a lot online about how "Y2K was overblown". As an engineer who worked like crazy in 1999 to correct our systems this drives me crazy.

Today I read about games going offline, watches getting frozen and airline tickets showing the wrong date.

Because of a leap year. Which happens every FOUR years (give or take).

Yeah, Y2K was a one-time thing. (Well, at least until year 10000)

Isn't 2038 going to be fun.

@damieng @mos_8502 There’s no glory in prevention.
@luhrman @damieng @mos_8502

We spent 6 months upgrading software and applying patches and then when Y2K came it was no big deal. I call that a win.
@luhrman @damieng @mos_8502 I want to learn more about how we change this. After all, we make glory.
@damieng note to self - retire before 2038
@kevindente
I'm planning to be retired then, but I'm gonna visit my company's ops room and bring popcorn.
@damieng
@itnomad @kevindente @damieng would say bring a popcorn maker… just not WiFi connected.
@kevindente @damieng note to self: don’t be on a plane

@jonathankoren Or below one. Or sufficiently close to any plane's probable trajectory. Especially Boeing planes.

@kevindente @damieng

@wonka @jonathankoren @kevindente Feels like good advice for current times.
@kevindente @damieng I plan to make up my entire pension in the year 2038
@kevindente @damieng No way! I'm earning all that consulting gold!

@damieng

You have my sympathy. I get incredibly annoyed with fuckwits who claim Y2K was a big nothingburger, after many many people worked their asses off to make sure it didn't crash civilization as we know it. I'm like "those bridges you drove over today, did they fall out from underneath you? No? Oh, does that mean the engineers overspecced them?"

@damieng i can still remember my parents demanding that i prep the computer we had at the time for Y2K. Spent many an hour researching and installing patches. Looking back, that was pretty much the start of my career in IT
@damieng Most of the 64-bit systems will be fine, it's all the older 32-bit stuff that will still be driving and controlling infrastructure that we need to worry about ...

@rhempel @damieng

I fear it might even be worse than that.

Even on 64 bit systems, all you need is one variable unexpectedly being a 32 bit type in one line of code and *boom*. I suspect the problems will be harder to see and we’ll end up with more bugs because of it.

@dschaub @rhempel @damieng

I have replaced "int" in most of my C code by "int32_t" or "int64_t".

That does not avoid the problem; but, when it happens, it will be because of my stupidity, not because of my laziness.

@JorgeStolfi @dschaub @rhempel @damieng As a C nerd I am disappointed in compiler vendors not having made `int` be 64 bits when targeting 64 bit processors.
@mlp @JorgeStolfi @dschaub @damieng As an embedded C nerd, and having lived through int being 16, 32, or 64+ bits, I never depend on the size of an int :-)

@rhempel I'm more annoyed we use data types that are for the computer's convenience rather than the humans and systems themselves.

COBOL got this right FWIW.

@damieng @rhempel COBOL is where people stored just '99' as a two-char string, leading to Y2K issues, though.

@mirabilos @rhempel But it was plainly obvious that's what it was doing. Dates were a bit weak but numbers and strings were great.

PIC S999,999,999V99 clearly indicates what the maximum and minimum numbers are :) Additionally you don't get unexpected IEEE floating point math rounding issues.

@damieng @rhempel yeah, floating point is the worst
@damieng @rhempel (or VAX floating point, which is not the same as IEEE floating point… or Turbo Pascal real…)
@mlp @JorgeStolfi @dschaub @rhempel I think there's all sorts of other breakage that comes out of that.

@damieng @JorgeStolfi @dschaub @rhempel Any breakage from the size of `int` changing is a sign of code written with assumptions that were poor in 1989 when the language was first standardised.

Granted this whole thread is about the fallout from poor assumptions, but I'd prefer to assume "surely the code will have been recompiled for a wider target/updated for greater storage by then" than "surely the Standard's explicit allowance for wider targets will have been deliberately ignored".

@JorgeStolfi @damieng @rhempel @dschaub @mlp that would have broken anything using u_int32_t, as that type would suddenly be promoted to signed int in arithmetic, plus we need a type in between short and 64-bit.

LP64 was the most sensible choice. LLP64 (as in NT) would not have worked with the vast majority of Unix code that puts those types as long, which was found in both earlier and later code than the code putting them into int.

@dschaub @rhempel @damieng

The downside of that change is that the "%d" and "%ld" formats still require "int" and "long int" values. So in order to print an int64_t value I may need one or the other, depending on the platform. I must use some ugly hacks to make my code portable...

The "%f" format can handle single and double floats. Why can't "%d" handle ints of ANY size?

@rhempel @damieng
Maybe. It depends how time_t is represented in every bit of software you run.
@damieng Have been impacted directly by the Azure 2012 Leap Year mishap. Had to work more than 24 hours in a row just to confirm that issue didn't impact us. Still sorta have trauma about that.
People like to ask "what would you do if you were world dictator?". And back when it was popular to wonder about that, I decided that I would enforce a new time system based on 365 days of 24 hours of 3600.10274 seconds of 1000 ms, without leap seconds or leap days, and without any months. Only UTC time.

@damieng It really is pathetic to see this happen today.

Nothing was seemingly learned from Y2K.

@damieng 2 years of my life were devoted to that, and while the impact of the actual event was minimal - or maybe even disappointing! - it was precisely because of all that work.
@damieng just had to google the 2038 bug. This is gonna be fun.
@EllieATF @damieng a good use case for arc search - browse for me feature : https://search.arc.net/fL54rrtCQOfO5wYbE0qJ
The Year 2038 Problem Explained | Arc Search


@marcmagnin The "Turn your computer off.." tip at the top of that page completely misses the point, though.

You can't patch anything while it's turned off, and both Y2K and 2038 are still an active issue when you turn it back on days later (unlike leap seconds/leap years)

@damieng that's from Best Buy's recommendation back in 1999, they slapped these stickers on machines back then 😅 While wrong, it makes for great memorabilia!
@damieng I was awoken this morning because of an Oracle error subtracting two years from 2/29. The same code failed in 2020 but that was on a Saturday so it was more easily ignored ¯\_(ツ)_/¯.
@7faces @damieng Was that using ADD_MONTHS() or using INTERVAL arithmetic? I had a problem recently on Teradata where the INTERVAL logic failed due to hitting 29 Feb but ADD_MONTHS worked OK. Go figure. ☹️

@hengymrohebwlad @7faces @damieng It was using interval. A cautionary tale of code that can run for years without issue.

This may already exist in Oracle and I just don't know about it, but errors like this seem like a good fit for something static analysis could weed out.

@7faces @damieng I worked on Oracle for over 20 years and never used "interval". I also don't recall ever hitting a leap year problem.

Now I work on Teradata, I see lots of interval arithmetic in other people's code. And suddenly I hit a leap year problem.

Coincidence? 🤔

I'll change these intervals to ADD_MONTHS if possible when I stumble across them and see what happens.

What could possibly go wrong? 😱

@damieng
I worry that my Tesla brain chip will reset to 1970 and insist it's time to invest in discotheques and wear bell bottom blue jeans.
@damieng we never get the credit for keeping the lights on.
Only kicked when the lights go out.
@damieng it was genuinely surprising to read that "Y2K Compliance" was an actual certificate given out by the DOD
@damieng And back in '99 I feel the tech world was pretty united behind the problem, and we got shit fixed. This time around, the disinformation machine is going to make that a hell of a lot harder. We're going to have brainwashed tech CEOs fighting the fixers.
@damieng I was still getting Industrial Automation software update packages that were not Y2K compatible in March 1999 - I was like "Does your development team live under a rock or something?" (Later interactions with that company showed that, yes, their entire development team did, indeed, live under a rock or something, and does to this day. Some things never change.)
Sometimes the best words an Engineer can hear are "Well, _that_ was anticlimactic!" - Yes, yes, that was the whole point...

@damieng there's a specific variant of problem (or maybe it's all problems?) that is hamstrung in the public eye because to a layperson, a situation wherein one is correctly prepared for it makes it look like it wasn't a serious problem in the first place.

covid, y2k, climate change come to mind as examples of how the problem manifests with different levels of preparedness. and i genuinely have no idea how to get some people to care about these kinds of issues.

@trenchworms @damieng in my experience it takes a certain level of intelligence and discipline to proactively reason about risk. Most people, unfortunately, can or will only react after the risk has materialized. Sometimes mentioning Noah's ark or some such story will help them to take your word for it and follow along, but don't expect any intelligent feedback. I've concluded focussing on those that show a certain level of discipline and intelligence is more productive.
@trenchworms @damieng in what world was COVID "correctly prepared for"?
@elexia @trenchworms The one I kept wishing we could dimension shift to?

@elexia @damieng

you misunderstand me, i'm saying that if covid were correctly prepared for, you'd have vast swathes of people saying "Why did we go to such extreme measures to deal with something that wasn't even that bad in the first place?"

they are the same category of problem but y2k was actually prepared for, whereas covid and climate change were and are not.

@trenchworms @damieng also very much agree that's exactly what people would have done. I mean people are kinda doing this for COVID anyway, even though we're very much suffering the consequences of it not being adequately dealt with.
@elexia If it didn't affect them or anyone they directly cared about.
@elexia @damieng absolutely, you could see that in many cases communities who were even just barely containing the outbreaks (and, lets be honest, many who weren't at all) were inundated with people using this as evidence that things weren't worth the preventative measures.

@damieng @negativeprimes "Y2K was overblown" <= tell me you know almost nothing about computing without telling me you know almost nothing about computing

The fact that people think it was overblown at all is a testament to the outrageous amount of work people did to prepare for what could have been a very unstable time