Pop culture obsessives writing for the pop culture obsessed.

The Y2K bug ended up being the millennium’s biggest anticlimax

Graphic: Lawrence Lawry/Stockbyte (Getty Images)
We explore some of Wikipedia’s oddities in our 5,664,405-week series, Wiki Wormhole.



This week’s entry: Y2K Problem

What it’s about: The weak sauce that passed for a global catastrophe 20 years ago. Like people’s checkbooks (ask your grandparents), computers tended to assume that every year started with “19” and only bothered filling in the last two digits. Which meant that when the clock struck 2000, every computer would glitch, destroying the electrical grid, sending planes plummeting from the sky, wrecking the monetary system… or causing some minor headaches for filing systems, depending on who you believed. Of course, none of that happened, largely because people saw the problem coming, reacted intelligently and responsibly, and fixed it before it could spiral out of control. It was a simpler time.
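The bug itself fits in a few lines. A minimal sketch (hypothetical function names, not any real system’s code) of how two-digit years go wrong once “00” means 2000:

```python
def expand_two_digit_year(yy):
    """Naive pre-Y2K-style expansion: assume every year starts with '19'."""
    return 1900 + yy

# A customer born in '65, in a record updated in '00:
born = expand_two_digit_year(65)   # decoded as 1965 -- fine
now = expand_two_digit_year(0)     # meant to be 2000, decoded as 1900
age = now - born                   # -65 instead of 35
```

Any calculation built on that difference, such as interest, ages, or expiration dates, suddenly runs a century backwards, which is why the fix was tedious but conceptually simple: store four digits, or pick a cutoff year for guessing the century.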


Biggest controversy: We avoided disaster in 2000 because of prudent planning and responsible action, or maybe those things are a waste of time. It’s difficult for people to view planning for a disaster that didn’t happen as worthwhile, even if the disaster didn’t happen because of all the planning. The world spent an estimated $300 billion on updating software to be Y2K compliant, and the majority of tech industry and government observers pointed to the lack of disaster as proof of success. But naysayers, fond as they are of saying nay, pointed out that countries that did almost nothing to avoid Y2K issues, and institutions that didn’t have the resources to spend (like small businesses and public schools), still had very few problems. There were also very few problems in the run-up to 2000, even though people in the late ’90s still occasionally had to enter post-1999 dates into computers. (Although the counterargument to that is that many Y2K compliance efforts began years in advance.)

Strangest fact: The Deep Impact space probe didn’t launch until 2005, but it still had Y2K issues. NASA launched the probe to release an “impactor” into comet Tempel 1, in order to study the debris caused by the collision. (It did so in July of that same year, and scientists learned that when you punch a comet, it kicks up a large, bright dust cloud that obscures your view and hampers your research efforts.) That done, Deep Impact was due to continue on to more comet and asteroid flybys. But on August 11, 2013, NASA lost communication with the probe. It turned out that its internal clock was set to start on New Year’s Day 2000, and had 32 bits set aside to keep track of the time. Once the clock counted 2³² tenths of a second—or about 13 years, 7 months, 11 days—it effectively reached the end of time (as far as the probe’s computer was concerned), and malfunctioned.


Thing we were happiest to learn: We were prepared for Y2K because there had been several previous Y2K-style technology problems. One early platform, DEC’s DECsystem-10, used only 12 bits of memory to store the date—that small amount of storage meant the clocks ran out in 1975, and widespread problems resulted that year. There was also a looming problem on 9/9/99, as it was common practice to type in “9999” as code for an unknown date. This wouldn’t cause any havoc with the computers themselves, as “9999” was simply a shorthand invented by people using those computers, but it would have caused widespread confusion if left unaddressed. There’s also a future Y2K problem, as Unix—the operating system that underpins most of modern computing—uses a 32-bit clock, which started on January 1, 1970, and will run down on January 19, 2038. A suggested fix is a 64-bit clock, which will run down in the year 292,278,994.
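That 2038 date isn’t arbitrary. A quick sketch of where it comes from, assuming the standard Unix convention of a signed 32-bit count of seconds since January 1, 1970:

```python
from datetime import datetime, timedelta, timezone

# A signed 32-bit counter tops out at 2**31 - 1 seconds (about 68 years).
UNIX_EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)
last_second = UNIX_EPOCH + timedelta(seconds=2**31 - 1)
print(last_second)  # 2038-01-19 03:14:07+00:00
```

One tick later the counter wraps around to a negative number, which a naive program reads as a date in December 1901, which is why the fix (already underway in most systems) is simply widening the counter to 64 bits.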

Thing we were unhappiest to learn: Of course, the conspiracy nuts had a field day. Wikipedia breaks out an entire section on “Fringe group responses,” which covers everything from religious fundamentalists to survivalists to cults to the good old end-of-the-world-is-nigh crowd. Leading voices in the survivalist/prepper community played up the threat of a worldwide collapse. Religious fundamentalists, especially ones already predisposed to end-times theology, made apocalyptic predictions. Even a mainstream fundamentalist like Jerry Falwell predicted that Y2K “might” lead to the rapture, and advised his followers to stock up on food and guns. Christian ministries across North America were making huge profits selling prep kits, survival guides, and the like. Deseret News reporter Betsy Hart observed, “preaching chaos is profitable and calm doesn’t sell many tapes or books,” and the Baltimore Sun later noted that “not one of these prophets of doom has ever apologized for their scare-mongering tactics.”


Best link to elsewhere on Wikipedia: Y2K wasn’t just an underwhelming historical event, it was an underwhelming movie! Y2K was a 1999 made-for-TV movie in which Ken Olin is a computer systems analyst (a perfectly cromulent action hero job in the late ’90s) who, after a Swedish nuclear plant melts down when New Year’s strikes in that time zone, has a matter of hours to save the Seattle nuclear plant where he works. Both the film’s realism and quality were savaged by critics, and speaking of savage, Wikipedia lists the disasters portrayed in the film as “power failures crippling the entire Eastern seaboard, computers unlocking all doors in a Texas prison, and Jay Leno continuing to broadcast.”

There was also a rival straight-to-video movie, also called Y2K, that somehow managed to land Louis Gossett Jr., Sarah Chalke, and Malcolm McDowell, but there’s no Wikipedia page, so we’ll just have to hope How Did This Get Made gets around to it.


Further Down the Wormhole: Another technological milestone that’s worrying enough that Wikipedia categorizes it under “global catastrophic risks” is the singularity, a moment when artificial intelligence surpasses human intelligence. Actually determining machine intelligence has proven to be extremely difficult. Wikipedia quotes a Scientific American article by Gary Marcus, in which he points out that “virtually every sentence [that people generate] is ambiguous, often in multiple ways,” so that we can intuit what each pronoun implies in “she talked to her,” but machines can’t. (There’s an explanation in here somewhere as to why Data can’t use contractions.)

Ambiguity is the condition or quality of being ambiguous—a situation in which a phrase’s meaning is not clear and can be interpreted in a number of ways. While a sentence can be heard or read correctly and still be misunderstood, there’s another layer of ambiguity when spoken language is misheard. Linguists use the term mondegreen for a mishearing that changes the meaning of a sentence. (It comes from someone mishearing the line “laid him on the green” in the Scottish ballad “The Bonny Earl Of Murray” as “Lady Mondegreen.”)


English-language music has always been rife with such misunderstandings, from “I can see clearly now, Lorraine is gone” to “There’s a bathroom on the right.” But there’s even more room for misunderstanding when lyrics are in a language foreign to the listener. The chorus of the Moldovan song “Dragostea Din Tei,” “Vrei să pleci dar nu mă, nu mă iei…,” translates to “You want to leave but you don’t want, don’t want to take me.” But to Japanese audiences, it sounds very similar to “米さ!米酒だろう!飲ま飲まイェイ,” which translates to, “Rice, obviously! Rice wine, most likely! Drink drink yay!” Because Americans rarely listen to foreign-language music, this phenomenon—called soramimi—is largely unknown here, but is widely celebrated elsewhere in the world. We’ll investigate next week, now ’scuse me while I kiss this guy.

Author of five books, including Selfdestructible, his first novel. He tells people he lives in New York, but he really lives in New Jersey.
