Twenty years ago, we were on the cusp of a new millennium. (Yes, I know that there are pedants who assert that the actual start of this millennium was January 1st, 2001, since there was no year “zero.” But I think most people were on the side of the “odometerists.” What they really cared about was watching all those number 9s roll over to 0s.)
In any case, January 1, 2000, came — the calendar turned, and not much happened. Power stations did not suddenly spin up or spin down their turbines, sewers did not back up, planes did not fall out of the sky, billionaires’ bank balances did not suddenly appear in the accounts of the 1 percent. In other words, the world continued pretty much as usual.
There are two possible explanations for this. Some have said that all the concern about Y2K was a trumped-up hoax — perhaps a money-making scheme for out-of-work Cobol programmers (or for people who want to make a quick buck selling silly plush toys like that fuzzy Y2K bug). But others would argue that the reason nothing much happened was because of all the work that was done — we really did avert potential disaster.
What was the Y2K problem, anyway? Well, memory for early computers was very expensive. Rather than use four digits to denote a year, programmers usually just used the last two digits. Which is fine until you cross a century boundary, at which point the person born in 1975 turns -75 on their 25th birthday. (It gets even more complicated when you consider that, because of the way computers represent negative numbers, a result that is not expected to be negative is likely to appear as a very large positive number instead.)
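The failure mode is easy to sketch. Here is a minimal illustration in Python (the `age_two_digit` helper is hypothetical, standing in for the kind of date arithmetic that much legacy code performed):

```python
def age_two_digit(birth_year_2d: int, current_year_2d: int) -> int:
    """Compute an age the way much legacy code did: by subtracting
    the last two digits of the birth year from the last two digits
    of the current year."""
    return current_year_2d - birth_year_2d

# In 1999, this works: someone born in 1975 ("75") is 24.
print(age_two_digit(75, 99))  # 24

# On January 1, 2000, the current year becomes "00", and the
# same arithmetic yields -75 instead of 25.
print(age_two_digit(75, 0))   # -75

# Worse, if that negative result lands in an unsigned 8-bit field,
# two's-complement wraparound turns -75 into a large positive age:
print((-75) & 0xFF)           # 181
```

The last line shows the effect mentioned above: the bit pattern of -75, reinterpreted as an unsigned byte, reads as 181.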
Of course, programmers were well aware of this. But the end of the century seemed so far away that no one thought critically about it; they assumed their code would be replaced long before it could cause trouble. As it turned out, a lot of code lasted far longer than anyone expected. “If it ain’t broke, don’t fix it” is generally considered good practice in software design, since even minor changes carry the risk of introducing unexpected consequences.
So, let’s take a trip back two decades, to 1999. At the time, I was leading a team of people responsible for IT and security for the computers and networks used by Nortel’s hardware and software designers. The computers we used were running various flavours of Unix (an operating system which very few non-techies had even heard of at the time, but which, these days, most of us are carrying around in our pockets).
As with pretty much all of the rest of the industry, Nortel’s phone system software, written in the 1970s, ’80s, and ’90s, used two-digit dates, and we had teams of people working on making products Y2K compliant.
By the end of 1999, the work was considered pretty much complete, but of course it would be arrogant to discount Murphy’s Law, ever the bane of engineers. So on December 31 of that year, teams of Nortel software designers stood ready to jump in and make last-minute fixes (since everyone needs to be able to call up their grandma to wish her a happy New Year). My internal tech-support team, in turn, needed to be available in case the designers’ workstations had Y2K problems of their own. Which meant I was not drinking champagne on the cusp of the millennium, just in case I had to drive to work and fix some servers.
(As it turns out, we did have a server failure related to Y2K. We had decided to reboot all our servers on January 1, 2000, to give us a chance to fix any issues that might come up before work resumed after the Xmas break. And, in fact, one of our servers failed to come up after reboot. It was getting on in years, and one of its disks did not survive the ordeal of powering down and back up again.)
It has been estimated that government and private expenditure to fix the Y2K bug was about $100 billion. When the year 2000 rolled around, nothing much happened, which has resulted in a lot of cynicism about whether it was really a hoax after all. But as skeptics, we can look at the evidence. Though companies were reluctant to provide specifics, it was pretty clear that a significant number of potential issues were found and fixed. Y2K hype might have fed into the hysteria of “preppers,” but perhaps having a bit of that preparedness trickle down to the general public is a good thing. *
People who work in IT and computer security will tell you that it’s one of those jobs where you are either invisible or in trouble. Invisible is better. Though it’s probably a good thing to have a bit of skepticism to keep us on our toes, there is always a balancing act. People are not very good at addressing “invisible or in trouble.”
A topical example is vaccines. As fewer people remember the deaths and disabilities caused by so-called normal childhood illnesses, the tendency is to minimize the risk. Now we are starting to see outbreaks of measles, which had been on the verge of eradication. And of course there is the looming existential threat of climate change. It’s no longer quite as invisible as it once was, but it remains to be seen how far we need to get “in trouble” before it is taken sufficiently seriously.
* PSA: Emergency Preparedness Canada recommends that people keep 72 hours’ worth of household supplies — medications, food, water, flashlights, etc.
This article appears in the December 2019 issue of Critical Links.