Yeah, I got my first office job thanks to Y2K. An enormous amount of money and effort was spent fixing it, and some of the fixes were needed years before 2000-01-01, for example in systems that projected dates into the future.
Biggest problem I saw was a program that stored 1999 as 99 and displayed the year by prefixing the literal string "19".
So when set to January 2000 it showed 19100. Its calculations were fine; only its display and reports were wrong.
Agreed, most of the actual problems seemed to be in reporting. I saw some COBOL stuff that rolled over to 1900. There were a few things where 00 wasn't a valid value, but mostly it was just really heinously written stuff that wasn't expected to still be in service even in the '90s.
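For anyone curious, both display bugs described above fall out of the same convention: many systems (like C's `struct tm`) stored the year as "years since 1900". A minimal sketch of what likely went wrong, with hypothetical function names:

```python
# Sketch of the two Y2K display bugs discussed above, assuming the
# program stored the year as "years since 1900" (as C's struct tm does).

def buggy_display_concat(years_since_1900: int) -> str:
    # Bug 1: prepend the literal string "19" to the stored value.
    # Fine for 1999 (99 -> "1999"), wrong in 2000 (100 -> "19100").
    return "19" + str(years_since_1900)

def buggy_display_truncate(years_since_1900: int) -> str:
    # Bug 2: keep only the last two digits, so 2000 rolls over to 1900.
    return "19%02d" % (years_since_1900 % 100)

def correct_display(years_since_1900: int) -> str:
    # The fix: actually add 1900 instead of gluing strings together.
    return str(1900 + years_since_1900)

print(buggy_display_concat(99))     # 1999 - looks fine
print(buggy_display_concat(100))    # 19100 - the January 2000 bug
print(buggy_display_truncate(100))  # 1900 - the COBOL-style rollover
print(correct_display(100))         # 2000
```

Note that in both buggy versions the stored value itself is correct, which is why calculations kept working while reports came out wrong.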
I think we're mostly using 64-bit machines now. Even loads of embedded stuff runs on 64-bit processors.
There will still be a lot of old software and hardware that needs updating before the 32-bit Unix time overflow in 2038.
Hopefully it’s another Y2K nothing burger. (Which was largely because a lot of people prepared for it)