This article by Edward Tenner was published in U.S. 1 Newspaper on
September 29, 1999. Copyright Edward Tenner. All rights reserved.
Y2K — A Crisis of Authority, Not Computers
by Edward Tenner
A former Wilson Center fellow, Edward Tenner is a visitor in the
department of geosciences at Princeton University (E-mail:
email@example.com). He is the author of "Tech Speak" and "Why
Things Bite Back: Technology and the Revenge of Unintended
Consequences." An earlier version of this article was published in the
Wilson Quarterly, autumn 1998.
Seventy years ago, W. I. Thomas and Dorothy Swaine
Thomas proclaimed one of sociology’s most influential ideas: "If
men define situations as real, they are real in their consequences."
Their case in point was a prisoner who attacked people he heard talking
absent-mindedly to themselves. To the deranged inmate, these lip movements
were curses or insults. No matter that they weren’t; the results were real.
The Thomas Theorem, as it is called, now has a corollary. In a computerized
society, if machines register a disordered state, they are likely
to create it. For example, if an automatic railroad switching system
mistakenly detects another train stalled on the tracks ahead and halts
the engine, there really will be a train stalled on the tracks.
Today, the corollary threatens billions of lines of computer code
and millions of pieces of hardware. Because they were written with
years encoded as two digits (treating 1999 as 99), many of the world’s
software programs and microchips will treat January 1, 2000, as the
first day of the year 1900. Like the insane convict, they will act
on an absurd inference. For purposes of payment, a person with a negative
age may cease to exist. An elevator or an automobile engine judged
by an embedded microprocessor to be overdue for inspection may be
shut down. All of our vital technological and social systems are vulnerable
to crippling errors. Correcting programs has required time-consuming
close inspection by skilled programmers, custom solutions for
every computer system, and arduous testing — and time is running out.
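The failure mode is easy to reproduce. Here is a minimal Python sketch — illustrative only, not drawn from any particular legacy system — of how two-digit year arithmetic misfires at the century boundary:

```python
def age_two_digit(birth_yy: int, current_yy: int) -> int:
    """Compute an age the way many legacy programs did:
    by subtracting two-digit years directly."""
    return current_yy - birth_yy

# A person born in 1965, evaluated in 1999: correct.
print(age_two_digit(65, 99))   # 34

# The same person evaluated in 2000, stored as "00": a negative age,
# exactly the kind of absurd inference described above.
print(age_two_digit(65, 0))    # -65
```

The program has not failed in any mechanical sense; it has simply drawn the wrong conclusion from a truncated representation.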
Nobody denies the hazards. And as we will see, if only because of
the original Thomas Theorem, the Year 2000 (Y2K) Problem is already
upon us. The unsettling question is just how serious it will remain
after more billions of dollars are spent between now and then fixing
and testing affected systems — fully 1,898 in the U.S. Department
of Defense alone, and hundreds of thousands of smaller computer systems
if those of small businesses are included.
Will the first days of the year 2000 be just a spike in the already
substantial baseline of system failures recorded in professional forums
such as the Risks site on the Internet? That might be called the fine
mess scenario. Or will it be a chain reaction of self-amplifying failures
— the deluge scenario? Y2K is not the first serious computer
problem, but it’s the first truly global one affecting many different
systems — especially mainframe computers that had never been touched
by viruses. What’s exceptional about it is contagion. It exposes the
weaknesses of the growing national and global interdependence that
had been points of pride.
The problem also coincides with two other major events: the economic
crisis in Asia and elsewhere, which probably has slowed Year 2000
work in many countries despite partial recovery; and the introduction
of the euro in the European Community, a change that has competed
with already scarce programmers’ time.
Because Y2Kology mixes evangelism, prophecy, and entrepreneurship,
its message has not won easy acceptance. Read closely, Y2Kologists
share no consensus on how severe the Y2K dislocations are likely to
be. As of February, 1999, Edward Yardeni, chief economist of the
Morgan Grenfell investment bank, estimated the odds of a recession
at 70 percent: 25 percent for a "modest" one, 40 percent for
a "major" one, and 5 percent for a "depression." But
an acknowledged aim of alarming predictions, as in George Orwell’s
"1984," is to galvanize people into action that will prevent
By September of this year, most experts agreed that crucial utilities
and financial systems were close to full compliance, though they could
not rule out cascading effects from remaining problems. (Some critics
of nuclear power insist that there are still unprepared nuclear plants.)
Covering all contingencies, Edward Yourdon and his daughter Jennifer
Yourdon have written a guide for coping with a variety of plausible
scenarios, which in their view range from a two-to-three-day disruption
to a 10-year depression.
And a few panicky Y2K programmers with no evident survivalist leanings
have retreated to the western deserts — the very area of the country
most dependent on electronically controlled federal water distribution
systems.
One thing is certain: the apprehension is real, and will have real
consequences. Just as the fear of nuclear war and terrorism has haunted
the world over the last two generations, so the mere possibility of
massive system failure will cast a shadow over its political,
business, and scientific rulers for years to come. Year 2000 is less
a crisis of technology than a crisis of authority.
For at least a century, the West has expected, and received, orderly
technological transitions. Our vital systems have grown faster, safer,
and more flexible due largely to cooperation among engineers, state
legislators, and industries to establish uniform codes and inspection
procedures in place of patchwork regulations and spotty supervision.
Most consumers pay little attention to the hundreds of national and
international standards-setting bodies. Only when major commercial
interests are at stake, as when specifications are established for
high-definition television or for sound and video recording, do the
news media report on debates. Laypeople are rarely present at these
deliberations.
Before the early 1980s, many conventions were handled
mainly as internal corporate matters. AT&T established exchange
and area codes, and IBM and a handful of other manufacturers upgraded
the operating systems of their mainframe computers.
And why should people worry? The record of these organizations was
unmatched in the world. A Henry Dreyfuss-designed, Western Electric
rotary telephone could work for a generation without repair. Railroads
long ago arrived at standards for compatible air brake systems that
allowed passenger and freight cars to be safely interchanged. And
evolving engineering standards have helped reduce accident levels
on the nation’s interstate highways. The future seemed to be in good hands.
But no comparable effort has been made to cope with the Y2K problem.
The breakup of AT&T, the explosion of utilities competition, the globalization
of manufacturing, and the rise of personal computing have all helped
diffuse authority over standards. And freedom from regulatory constraints
has brought immense benefits to manufacturers, consumers, and the
economy. But it has had an unintended consequence. The diversity of
systems and the fierceness of business rivalries discourage public
and private technological authorities — from the Defense Department
to Microsoft — from taking firm and early action to cope with
emerging problems. In general, governments have avoided interference
in commercial decisions, and businesses have succeeded more by anticipating
market shifts than by staking out ambitious new standards. As the
Thomas Theorem implies, if people do not believe they can exert authority,
then they cannot. Which brings us to "the millennium bug,"
which is no bug at all.
Over the last four decades, the Year 2000 Problem has passed through
three phases, each bringing its own challenges for authorities. The
first age, the Time of Constraint, lasted from the origins of
computing to the early 1980s. The managers and programmers of the
time knew that programs using only two-digit years had limits, but
there was a strong economic case for two digits.
Leon Kappelman and the consultant Phil Scott have pointed out that
the high price of memory in the decades before personal computing
made early compliance a poor choice. In the early days of computing,
memory was luxury real estate. A megabyte of mainframe disk storage
(usually rented) cost $36 a month in 1972, as compared with 10 cents
in 1996. For typical business applications, using four digits for
dates would have raised storage costs by only one percent, but the
cumulative costs would have been enormous. Kappelman and Scott estimate
that the two-digit approach saved business at least $16 to $24 million
(in 1995 dollars) for every 1,000 megabytes of storage it used between
1973 and ’92. The total savings are impossible to calculate, but they
surely dwarf most estimated costs of correcting the Year 2000 problem.
(One leading research group, the International Data Corporation, estimates
a correction cost of $122 billion out of more than $2 trillion in
total information technology spending in the six years from 1995 to 2000.)
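The arithmetic behind these figures can be sketched quickly. The 200-byte record below is an assumed typical size for illustration, not a figure from Kappelman and Scott:

```python
# Back-of-the-envelope check; assumed values are noted in comments.

cost_per_mb_1972 = 36.00   # dollars per megabyte of rented storage, per month
cost_per_mb_1996 = 0.10

# Storage prices fell by a factor of roughly 360 over that span.
print(round(cost_per_mb_1972 / cost_per_mb_1996))  # 360

# Why four-digit years add only about one percent to storage: two
# extra characters (YYMMDD -> YYYYMMDD) on an assumed 200-byte record.
record_bytes = 200
extra_bytes = 2
print(100 * extra_bytes / record_bytes)  # 1.0
```

A small percentage, multiplied across billions of records rented by the month for two decades, is how modest per-record savings compound into the totals the consultants describe.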
Even where Year 2000 compliance was feasible and economical, it wasn’t
always in demand. In the 1980s, a number of applications programs
were available with four-digit dates, such as the statistical programs
and other software systems produced by the SAS Institute, one of the
software industry’s most respected corporations. SAS does not appear to have promoted
it competitively as a major feature. The Unix operating system,
developed at Bell Laboratories, does not face a rollover problem until
2038, yet this too did not seem to be a selling point. Even Apple
Computer did not promote its delayed rollover date of 2019. The year
2000 still seemed too far away.
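The 2038 limit mentioned above comes from the Unix convention of counting seconds from January 1, 1970, in a signed 32-bit integer. A short Python check of where that counter runs out:

```python
from datetime import datetime, timezone

# A signed 32-bit time_t counts seconds from January 1, 1970 (UTC)
# and tops out at 2**31 - 1 seconds.
MAX_32BIT_SECONDS = 2**31 - 1

rollover = datetime.fromtimestamp(MAX_32BIT_SECONDS, tz=timezone.utc)
print(rollover.isoformat())  # 2038-01-19T03:14:07+00:00
```

The design buys nearly seven decades of headroom from four bytes, which is exactly the kind of tradeoff the two-digit year also represented, on a longer fuse.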
By the mid-1980s, the Time of Choice was beginning. The economic tradeoff
— initially higher storage and processing costs versus long-term
savings in possible century-end conversion costs — would have
still been an open question, had it been openly raised. The great
majority of crucial government and business applications were still
running on mainframe computers and facing memory shortages. But the
trend to cheaper memory was unmistakable. The introduction of the
IBM PC XT in 1983, with up to 640 kilobytes of random access memory
(RAM) and its then-vast fixed hard drive of 10 megabytes, was already
signaling a new age in information processing.
Yet the possibilities presented by the new age remained an abstraction
to most computer systems managers and corporate and government officials.
Then as now, most of their software expenses went not to create new
code but to repair, enhance, and expand existing custom programs —
what are now called "legacy systems." A date change standard
would initially increase errors, delay vital projects, and above all
inflate budgets. And it was not a propitious time to face this kind
of long-term problem.
The American industrial and commercial landscape during the 1980s
was in the midst of a painful transformation, and investors appeared
to regard most management teams as only as good as their last quarterly
results. Only the mortgage industry, working as it did on 30-year
cycles, had recognized the problem (in the 1970s) and begun to work on it.
Not that government was much more prescient. The Federal Information
Processing Standard of the National Institute of Standards and Technology
(NIST) for interchange of information among units of the federal government
specified a six-digit (YYMMDD) format in 1968 and did not fully change
to an eight-digit (YYYYMMDD) format until 1996. The Social Security
Administration was the first major agency to begin Year 2000 conversion,
in 1990. The U.S. Air Force used single-digit dates in some 1970s
programs and had to have them rewritten in 1979. Despite the huge
military budget increases of the 1980s and the Pentagon’s tradition
of meticulous technical specifications for hardware, many vital Defense
Department systems still require extensive work today.
The computing world of the 1990s recalls a multimedia trade show
decorated at great expense and stocked with the best equipment money
can buy, yet still dependent on a hideous, half-concealed tangle of
cables and power lines, with chunky transformer blocks jutting out
from maxed-out surge protectors. Our apparently seamless electronic
systems turn out to be patched together from old and new code in a
variety of programming languages of different vintages. The original
source code has not always survived. Year 2000 projects can turn into
organizational archaeology and confront us with many such examples
of engineering coexistence.
During the Time of Choice, the problem was recognized
but deferred for two reasons. First, there was the chance that entire
computer systems would be replaced before 2000. Second, future software
tools might reduce conversion costs sharply. Y2K work always has had
opportunity costs. There has always been something that seemed either
more urgent, more profitable, or at least "cooler." (Even
now, I have heard of no Y2K tycoons. Some Y2K consultants have branched
successfully into other computer advice, but the stock of the publicly
traded specialized Y2K firms as a group has declined sharply.) Furthermore,
turnover in executive ranks and expectations of earnings growth discouraged
long-term thinking at the very time that it was called for — in
the mid to late 1980s.
The Time of Choice ended in the early 1990s, when leading computer
industry publications prominently recognized Year 2000 conversion
as a problem and warned of the consequences of neglecting it. It was
followed by the Time of Trial in the mid-1990s, as conversion programs
began in earnest and Y2K issues were increasingly aired in the general
press. It will probably last until around 2005.
A few annoyances are already apparent. Credit cards with 2000 expiration
dates, for example, have been rejected by some authorization systems.
Y2K optimists are taking heart at the absence of major problems at
some anticipated trouble spots: the beginning of fiscal year 2000
in many organizations over the summer, and more recently the rollover
of older global positioning system (GPS) equipment and the passing of
September 9, 1999 (9/9/99), a date once used by programmers as an
arbitrary marker.
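The GPS rollover arose because the legacy navigation message carries the week number in a 10-bit field, which wraps after 1,024 weeks. A short Python sketch of when the first wrap occurred:

```python
from datetime import date, timedelta

GPS_EPOCH = date(1980, 1, 6)   # week zero of GPS time
WEEK_FIELD_BITS = 10           # the broadcast week counter is 10 bits wide

# After 2**10 = 1,024 weeks the counter wraps back to zero.
first_rollover = GPS_EPOCH + timedelta(weeks=2**WEEK_FIELD_BITS)
print(first_rollover)  # 1999-08-22
```

Receivers that did not anticipate the wrap could misread the date by nearly twenty years, a compressed preview of the century problem itself.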
If the coexistence of past, present, and future was the discovery
of the Time of Choice, triage is becoming the watchword of the Time
of Trial. Fortunately, information technologies are not created equal.
Some organizations have hundreds or even thousands of computer systems,
but only a minority are vital and only a few may be critical. It
is too late to fix everything, even with emergency budgets and the
mobilization of computer-skilled employees from other departments.
As the project management guru Frederick P. Brooks pointed out in
his classic "The Mythical Man-Month" (1982), adding programmers
to a late project can actually delay it further. In a complex
system, more things can go wrong.
In the Time of Trial, triage will not be the only military metaphor.
Many other information technology projects will be suspended or postponed
as programmers are called up for the front. Careers will be damaged
and entire organizations will be set back. Well-prepared companies
will gain strategic advantages.
Expert opinion now expects a fine mess rather than a deluge. Banks
and investment houses have been advertising their Y2K compliance.
Organizations that feared they would not make the deadline have bought
new hardware or software or both, outsourced vital functions, or even
merged into larger companies to avoid paralysis. These decisions have
brought problems of their own, like the difficulties of some companies
running the systems of the human resources vendor PeopleSoft, but
the new glitches are not attributed to date changes. (It is likely,
in fact, that the biggest impact of the Year 2000 episode will not
be date-dependent problems that were missed, but unrelated errors
introduced in the myriad lines of code added or changed for Y2K remediation.)
But some dangers will persist despite the efforts of even the most
resourceful managers. Realization of any one of the five most ominous
threats could validate the doomsayers’ predictions. These risks might
be abbreviated as SMILE: second-order effects, malicious code,
international exposure, litigation, and embedded processors.
The Thomas Theorem suggests that the expectation of a Year 2000 crisis
may be enough to create a real one no matter how effective the efforts
to repair the underlying code. Our social and technological systems
are more efficient than ever, but because, for example, information
technologies now allow vendors and manufacturers to maintain lean
warehouse inventories, slight disruptions can have more serious consequences.
Running the gamut from shifts of investment funds based on
rumors about Y2K readiness of particular companies, to depletion of
bank and automatic teller machine currency supplies, to runs on bread
and toilet paper, a late 1999 panic might be comical but also costly.
Add potential sabotage to the equation. The Pentagon already worries
about information warfare and terrorism. Hostile states, criminal
organizations, and domestic and foreign radical movements can already
attack vital networks. The beginning of the year 2000 is a perfect
cover. Do not forget embezzlers and vengeful staff. An apparently
Year 2000-related incident could mask electronic robbery, and a
shortage of skilled personnel could delay diagnoses for priceless
months. Computer security experts also fear fly-by-night Y2K consultants
who may collude with corrupt managers to offer bogus certification,
or plant Trojan horse programs in the systems of honest but desperate
clients.
Thanks to decades of global thinking, North America
and Europe are also linked to nations whose Year 2000 readiness makes
many Western nations look like paragons. The Asian financial crisis
that began in 1998 has surely delayed the compliance programs of some
major trading partners of the United States and Europe. International
interchange of data may send a failure in one country rippling through
the most rigorously Year 2000-ready systems: the sociologist Charles
Perrow calls this "tight coupling." Major corporations are
already pressing their trading partners for certification of their
Year 2000 compliance. Domestically, this may make or break some firms,
but it will not bring down the economy. Internationally, it may trigger
local crises that might lead to mass migrations or insurrections.
And even if all suppliers can be certified, who will verify compliance
of their subcontractors? In fact, any attempt to make Y2K certification
universal, down to the last level of sub-vendors, would trigger a volume of
correspondence likely to disrupt commerce as much as any actual failures.
The courts have only begun to consider legal liability for Year 2000
failures. The cases already on the docket will test one of the law’s
basic principles: never to decree retroactively, but to create predictability.
Because Year 2000 cases will raise new questions and provoke immense
claims, the litigation will be prolonged and possibly ruinous. On
the other hand, recent federal legislation limiting Y2K liability
might prevent businesses and individuals from collecting justifiable
claims, according to consumer advocates and some Y2K experts. Should
plaintiffs’ rights be curtailed just because product defects are electronic
rather than mechanical, or because many firms were equally
negligent? It remains to be seen whether legislatures and courts can
deflect nuisance suits while protecting users.
The most serious wild card of all, though, is a hardware issue. Most
discussions of the Year 2000 Problem focus on the difficulty of correcting
and testing software, but that is a cinch compared to dealing with
the thousands of embedded microchips that control critical systems.
The Gartner Group estimates that 50 million embedded devices may malfunction.
Traffic signals and freeway entrance metering lights will fail. Elevators
will shut down if their electronic hardware tells them they have not
been inspected for nearly a hundred years. (The largest elevator makers
deny their products are vulnerable to Y2K failure.) Electric power
distribution switches and pipeline controls will interrupt energy
flow. X-ray machines will not turn on — or far worse, off —
at the proper times.
The Year 2000 Problem shows that neither military nor civilian authorities,
neither social democracies nor authoritarian regimes nor market economies,
neither big business nor small business, took fully adequate steps
in planning for the future. If centralized technological planning
is discredited, if the discipline of markets (such as securities
reports and insurance underwriters’ risk assessments) has failed to
give timely warning that cannot be ignored, what is left? Perhaps
it is the realization that technology is not just a radiant future
but a messy present, that the age of transition never ends, and that
rapid novelty, massive legacy, and tightly coupled systems can combine
to create lethal assumptions.
Fortunately, crises have nearly always stimulated important and beneficial
innovations. They call attention to people and ideas outside the mainstream
— in this case, programs in information technology management
as opposed to the older and larger MBA programs and computer science
departments. Tools and concepts emerging from Year 2000 remediation
and testing may well have uses that extend to future conversion problems.
As usual, it’s hard to say what lessons these will be until we learn them.
January 1, 2000, will not be the first danger point, and it will be
far from the last. The outcome of Y2K will change everything, but
if we already knew what will be changed, there would have been no
Year 2000 crisis, only a problem. Making systematic correction and
recovery easier may be the hardest job of all. But our pace makes
it necessary. As the Red Queen said in "Through the Looking-Glass,"
"Now, here, you see, it takes all the running you can do, to keep
in the same place. If you want to get somewhere else, you must run
at least twice as fast as that!"
This page is published by PrincetonInfo.com
— the web site for U.S. 1 Newspaper in Princeton, New Jersey.