In a little less than a year and a half, something bad is going to happen.

The world's computers, on which just about everything we eat, earn, spend and save depends, are going to suffer a seizure – or rather a snowballing series of tiny spasms, which in the end may amount to the same thing. The timing, the tempo, the gravity of the impact on you and me are uncertain, but it is very likely that we will find ourselves living “in interesting times.”

The cause of our impending adventures is the Millennium Bug, also known as Y2K. As computer problems go, this one is almost unbelievably simple. Nearly all computers and databases store the year as a two-digit number, like “76” or “98.” It's always been understood that the missing digits are “19.” As of less than two years from now, however, the missing digits won't be “19” anymore. But many computers won't know that. Their internal calendars will flip over from 99 to 100, the first digit will fall away because only two spaces are provided to carry the year, and they'll be back in 1900.
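
To make the mechanism concrete, here is a minimal sketch in a modern language (Python; the systems actually at risk mostly run decades-old mainframe code, and every name below is invented for illustration) of how a two-digit year field silently lands back in 1900:

```python
# Toy illustration of the two-digit date field described above.
# A legacy-style record keeps only the last two digits of the year,
# and readers assume the missing digits are "19".

def store_year(full_year):
    """Save a year the way many old systems did: two characters only."""
    return str(full_year)[-2:]           # 1999 -> "99", 2000 -> "00"

def read_year(two_digits):
    """Interpret the stored field, assuming the century is 1900."""
    return 1900 + int(two_digits)        # "99" -> 1999, "00" -> 1900

print(read_year(store_year(1999)))       # 1999 -- correct
print(read_year(store_year(2000)))       # 1900 -- the Millennium Bug
```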

The ramifications of this computational lapse into second childhood are extremely complex. Some of its many untoward effects will be quite unexpected and seemingly unrelated to the date. There are well-informed people who think its effects will be comparable in magnitude to those of a whole string of natural disasters, or a depression, or world war, or even – what the heck! – an asteroid.

REPENT! FLEE!

Most of us are reluctant to suppose that anything comparable to the extinction of the dinosaurs is a possibility, and especially that it could be brought about by something as silly as two-digit date fields. But the hypothetical path that leads from two-digit dates to the collapse of civilization is less circuitous than you might suppose. The most pessimistic scenario looks like this. Data-processing failures lead to delays and errors in billings, interruptions of payments, inability to determine debts and receivables. Amid growing financial disruptions, ill-prepared businesses and banks begin to fail for lack of cash flow, individuals can't get access to their money, securities markets slide as confidence crumbles. Even the manufacturers who have done their Y2K homework find their assembly lines stopping because parts don't arrive from suppliers who haven't. As goods become less universally available, supermarket shelves are emptied by hoarders.

In the meantime, glitches in malfunctioning electronic controllers lead to widespread power failures in the middle of winter, explosions or fires in chemical plants and refineries, effluent spills from sewage-treatment plants, unsafe drinking water, fuel shortages, breakdowns in medical services and communications. Repair teams are overwhelmed. Air traffic slows to a crawl because '80s-vintage traffic-control computers can neither be relied upon nor fixed. Everyday life becomes a minefield of erratic traffic lights, stalled elevators, false alarms, busy signals and missed payments. Government, tormented by as many electronic demons as anyone else, suspends some services and fails to deliver others – including various kinds of entitlement checks. Civil disorders spread. The well-supplied arm and barricade themselves against foraging guerrilla gangs. And so on.

To believe that such a sequence of events is plausible, you must first be willing to accept that the civil institutions of the developed world are not very robust. In some ways, they are not. There are certainly forces in our society that are only barely held in check, and that single events – such as the first Rodney King verdict – can unblock. You must also believe that the modern electronic world, in which we travel, consume, work and play with nearly frictionless efficiency, does not represent the natural state of things, but is rather an elaborate artificial network of delicate, invisible symbioses.

For a quick taste of the gloomiest possible view of Y2K, you can do no better than to visit the Web site of Dr. Gary North at www.garynorth.com. North's Ph.D. is in history: It is an ornament whose generic luster Newt Gingrich has, to be sure, somewhat diminished. North takes an apocalyptic view of the years ahead. He foresees a general collapse of civil institutions, and urges his readers to leave the cities before the brimstone starts to fall. North is sometimes so reminiscent of those bearded characters with REPENT signs in old New Yorker cartoons that it is not surprising to learn that his motives are, in fact, at least partly religious. One would not guess this from his Web site, which appears – despite its survivalist overtones – balanced and well-researched and omits to mention his religious connections, but North is, in fact, a Christian extremist, who in his writings has suggested that “those who refuse to submit publicly to the eternal sanctions of God by submitting to . . . baptism and holy communion, must be denied citizenship, just as they were in ancient Israel.” His Y2K stance can be viewed as a variation on Christ's advice to his prospective disciples to give up what they had and follow him. But being religious doesn't automatically make North wrong about Y2K. Even if you don't share his world-view or his pessimism, his Web site is worth visiting for its huge number of links to other sources, and for its detailed exposition of possible Y2K eventualities.

The doomsday view of Y2K could be easily brushed aside if Gary North were its only advocate, or even a typical one. But he isn't. There are many close observers of the impending millennium who are quite pessimistic – not so much so as North, perhaps – and who do not have a dogmatic ax to grind. One is Ed Yardeni, chief financial analyst for the New York securities firm of Deutsche Morgan Grenfell. Yardeni expects a worldwide recession to follow the turn of the millennium; he currently puts the chance of it at 70 percent, and raises the figure almost monthly. In his taxonomy of various likely or possible Y2K mishaps, Yardeni breezily calls the worst an “Ellie,” or “Extinction Level Event.”

THE MEDINA EFFECT

Another moderate pessimist is a computer professional named – for no religious reason – Desmond Nazareth. Nazareth is a Philadelphia Y2K specialist, and has the distinction of being one of the first, if not the very first, to try to make a business out of Y2K repairs.

In 1987, Nazareth was a junior programmer in a large financial institution. Somebody noticed a money leak in the processing of late charges; over a period of six months, roughly half a million dollars had been unaccountably lost. Nazareth tracked the problem down to a “century overflow condition” in a few lines of code in a 20,000-line mainframe program written 15 or 20 years earlier. One of his proposed solutions was sweeping: to expand the year field from two to four digits in the affected company's databases, and to modify all the software that used them.

“People were highly uninterested,” says Nazareth. “Their reaction was, 'Ridiculous. We couldn't possibly do so much for something as simple as this!'”

After ruminating on the problem for half a year, Nazareth and another programmer formed a company and wrote software to automate system and data repairs. They failed to interest either software publishers or venture capitalists; the payoff was too remote. Managers simply could not believe that such a silly, simple problem could be a serious one. Their response was what Nazareth calls “magical thinking”: “Something will happen to fix it.” It is only in the past three or four years that Nazareth's foresight has finally become the conventional wisdom.

Nazareth compares the data-processing systems of large corporations and financial institutions to the medinas – the ancient native quarters – of Middle Eastern and North African cities. Warrens of passageways, chambers, tunnels and oubliettes that have grown organically over centuries, medinas bewilder strangers and sometimes mystify even their initiates. Large software systems – General Motors, for example, uses about 2 billion to 3 billion lines of program code – are similar: No one knows everything that is going on or could go on inside them, or what all their vulnerabilities and potential interactions with other systems may be. Even after repairs have been made, therefore, it is impossible to be sure that Y2K problems are gone, or even to test completely enough to be certain that the fixes themselves have not introduced new, unexpected side effects.

“The basic problem is the complexity itself,” Nazareth says, noting that even companies that correct their own problems are still vulnerable to breakdowns in their supply chains and interfaces with other companies' data-processing systems. The interconnectedness of data-processing systems makes determining “Y2K compliance” – ability to handle 21st-century dates – extremely tricky. As Robert Lefkowitz, vice president for desktop compliance with Next Era Consulting Corp., puts it, “[Our business partners] can't be compliant, because we know that in most cases they haven't questioned our compliance. Can any company be compliant if its business partners are not?”

Some large corporations have made efforts to determine the Y2K integrity of their suppliers, but the job is next to impossible. Questionnaires go unreturned, and telephone queries elicit standard responses – “We expect to be ready by December 1998” – that prove, on closer examination, to be hollow. “We are assessing our Y2K status,” for instance, is synonymous with “We haven't done anything yet.” Among those who do respond, the majority have not taken any practical steps to remedy the problem. Recently, the National Federation of Independent Businesses found that 75 percent of small businesses had done nothing about Y2K. It predicted that 330,000 small businesses would be bankrupted by Y2K and 370,000 would be “temporarily crippled.”

How will businesses be affected? In many ways, but the most obvious and clearly understandable examples are in accounting. How much interest is due in January 1900, on a loan made in 1986? Confronted with the command to generate an invoice for payment on a loan to be made in the distant future, what will a computer do? Request a negative payment? Spew out reams of useless garbage? Post an error message on the screen and stop running altogether? And think about payroll: How will a computer process a check for an employee who, according to its records, has not yet been hired? Incredibly, some old software even uses “99” as a code to mean “the end of time,” and might, for instance, automatically order deletion of all files with “99” dates.
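
The interest example can be worked through in a few lines. In this hypothetical routine (the principal, rate and loan year are all invented for illustration), the elapsed time goes negative the moment the stored year wraps around to 00:

```python
# Hypothetical simple-interest routine using two-digit years, to show
# the kind of accounting failure described above. All figures invented.

PRINCIPAL = 10000.0
ANNUAL_RATE = 0.08
LOAN_YEAR = 86                  # loan made in 1986, stored as "86"

def interest_due(current_two_digit_year):
    years_elapsed = current_two_digit_year - LOAN_YEAR
    return PRINCIPAL * ANNUAL_RATE * years_elapsed

print(interest_due(99))         # 1999: 13 years' interest, $10,400.00
print(interest_due(0))          # "1900": -86 years, i.e. -$68,800.00
```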

These examples may seem more amusing than ominous, but in the globally interconnected world of finance, numbers are everything and time is money. The chief assets of banks, insurance companies and investment firms, beside which their skyscrapers are mere baubles, are their databases. Corrupt the data, destroy the business.

Government agencies throughout Southern California are beginning only now to realize the potential seriousness of the problems they face. At the Los Angeles Unified School District, 48 in-house programmers are working nearly full time on reviewing some 24 million lines of code. The possible ramifications of not fixing the problems are myriad. Payroll systems, for example, track the licenses of school police officers and bus drivers. Those whose licenses have expired don't get checks. “There could also be an impact on our retirement system,” says Terryl Hedrich, director of administrative support for LAUSD's Information Technology Division. “For benefit purposes, computers have to calculate how long an employee has been with the district. The computer could generate false data because it doesn't know how to subtract whole numbers from 00.” Hedrich says the schools are also evaluating things such as refrigeration units that may make date comparisons in calculating when to automatically defrost, student records that rely on birth dates to calculate grade placements, and numerous financial systems. She is optimistic that they will finish the fixes with time to spare. But to correct all the problems will cost, an outside consultant has estimated, some $48.5 million.

It's difficult to gauge how far local corporations have gotten in assessing and fixing their Y2K problems. Many smaller businesses just assume they won't have problems. As the owner of one computer-animation company put it, “None of what we do on a computer involves dates. If we have to, we'll just put out our payroll manually.” But he acknowledges not having analyzed the company's software to see if it is date-reliant in some hidden way. Other companies give vague but confident assessments. At Mattel, a spokesman said, “a full-time team is working on inventorying computer programs and projects.” The company has been “working for a few years” and “definitely” expects to be compliant. At Parsons Infrastructure and Technology Group, S.G. Anand acknowledges the difficulty of the problem, but says the company is evaluating its embedded chips, computers and software as well as interfaces with outside systems. The big challenge, Anand says, is not the fixing but the testing.

Despite the generally reassuring tone of many Y2K pronouncements, the comptroller of a large locally based private pension fund undoubtedly spoke for many businesses when, after asking to remain unnamed, he admitted, “We're scared to death. It could destroy us.”

IF IT AIN'T BROKE

It seems incredible that the brilliant technicians who crafted the world's vast computer infrastructure could have been so shortsighted as to build this time bomb into it. How could they have made such a fundamental blunder?

It began in the '60s, when computers filling whole rooms had neither the capacity nor the speed of a single modern-day PC. Because memory space was scarce and expensive, data-processing systems that handled millions of records had to keep each one as small as possible. A programmer's ingenuity was measured by the number of bytes he was able to pare out of a procedure or a file. In those days, with data processing in its infancy, system design in flux and the turn of the millennium decades away, two-digit years seemed to make good sense.

By the '80s they didn't anymore, but the damage was already done. Data-processing systems had grown reeflike upon older ones, because it was always cheaper to write new code to interface with existing systems than to replicate the existing systems for new types of computers with new capabilities. Two-digit years were locked so firmly into computer systems that even managers who recognized the potential problem – and some programmers were sounding warnings to their supervisors way back in the '70s – could not justify the expense and effort required to correct it. As Desmond Nazareth and many others like him found, the managers of corporate data-processing departments were complacently confident that this gremlin would be outwitted somehow or other, as others always had been, and that at any rate they wouldn't still be around to worry about it.

It was true that the problem was not, in computational terms, a very difficult one. The shortage of digits could be remedied in various ways. For example, there was “windowing”: program code could be added to systems to capture all dates as they go in and out, deciding in each case, on the basis of common-sense rules, which century they belong to. Or you could write programs to search through other programs and databases, finding their date fields and expanding them. You could even buy time by backdating all transactions by exactly 28 years, since January 1, 1972, falls on a Saturday in a leap year, just as January 1, 2000, does.
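
A windowing fix, and the calendar coincidence behind the 28-year trick, can both be sketched in a few lines. The pivot value of 50 below is an arbitrary illustrative choice, not a standard:

```python
import datetime

# Sketch of the "windowing" repair: a pivot decides which century a
# two-digit year belongs to. The pivot of 50 is illustrative only;
# real systems picked pivots to suit their own data.
PIVOT = 50

def window_year(two_digits):
    return (2000 if two_digits < PIVOT else 1900) + two_digits

print(window_year(86))   # 1986
print(window_year(5))    # 2005

# The 28-year backdating trick works because the calendars line up:
print(datetime.date(1972, 1, 1).strftime("%A"))   # Saturday, in a leap year
print(datetime.date(2000, 1, 1).strftime("%A"))   # Saturday, in a leap year
```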

But the fundamental problem, as it turned out, was not the two-digit year itself. It was the sheer number of places that it turned up. By the time corporations and governments began to take the glitch seriously, the number of systems affected, the number of instances in which errors were possible, and the potential ramifications of every error because of the interconnectedness of nominally separate data-processing systems had grown to inconceivable size. It was as if it had suddenly been discovered that every 1/4-inch bolt in every structure and machine built since 1970 were made of a material that would turn to mush at the end of the century.

Unlike individual owners of home PCs, each of whom has a private copy of a single public product and can buy or download standard upgrades or patches, businesses and government agencies have tended to develop custom-designed systems and software. The larger the operation and the longer its data-processing history, the more likely it is that documentation of the inner workings of programs has been lost; that the programs, or parts of them, were written in now-extinct languages by programmers long retired or deceased; or that they were written in unstructured and undisciplined styles that are hard to decipher and modify today.

Thus, by the middle of this decade every large organization found itself looking at millions of lines of code, some of which did things that no one understood. It would be necessary to search through them for every instance of an operation using a date, modify those to work past the turn of the millennium and then test them exhaustively, since it is axiomatic in programming that any modification may produce unintended side effects. And all this needed to be done while the programs were running day and night, and on the organization's own computer systems, while maintaining an impermeable barrier between the operating and the experimental versions of code and data. It had to be done without contaminating old data with new dates until the entire revised system was deemed ready for service.

Despite the looming obviousness of the problem, even today Y2K remains a hard sell. Managers are reluctant to spend a lot of money – their entire data-processing budget for one or more years – on activities that produce no discernible gain. But there is no longer any choice. And the cost of evaluating, fixing and testing what at first glance looked like a trivial glitch turns out to be unbelievably high – from $1 to as much as $10 per line of code modified. The Gartner Group, a leading information-technology consulting organization, has estimated the worldwide cost of Y2K remediation at $600 billion – about a tenth of the U.S. GDP for a single year. But estimates are wildly unreliable. The state of California, for example, believes its efforts will cost some $240 million. But the far smaller Los Angeles Unified School District estimates its costs at more than a fifth of that amount, suggesting that one or the other has grossly miscalculated.

One thing is certain, however: Large corporations are now routinely budgeting hundreds of millions for repair and testing. Whether they will ultimately spend more or less than expected remains to be seen, but many experts suspect that current estimates of eventual costs are optimistic.

A RACKET

AND A RUSE

It's natural that some of the people involved in Y2K remediation have become prophets of doom: They have a firsthand view of a disaster unfolding. But they are also interested parties. They are making money from the problem and hoping to make more, and the more frightened their potential customers are, the more money they stand to make. (More subtle, but perhaps no less important to would-be Cassandras, is the increment to one's self-esteem that comes from feeling that one's work is saving the world.) So as the drumbeat of ominous warnings accelerates, there has also developed a backlash. Y2K debunkers denounce the alleged problem as a full-employment program for computer nerds, and dismiss warnings of dire consequences as so much millenarian raving. Their attitude was typified by a 1996 article in American Programmer magazine by Nicholas Zvegintzov, a computer scientist and writer. The Year 2000 problem, he wrote, was a “racket and ruse” used by programmers to suck money into their departments. Solving it was, in Zvegintzov's opinion, “an exercise for the software novice.”

And he has not changed his opinion. “The truth is that January 1, 2000 will come and go, and the world and its computers will go on working about as well as before, with perhaps a few minor mishaps, and you will find yourself wondering what the big excitement was about,” Zvegintzov said recently. “Just as cockroaches are said to be perfectly adapted to surviving nuclear holocaust, so bureaucracy is perfectly adapted to surviving the Year 2000.” The ruse, if ruse it be, has succeeded mightily. In many organizations, the amount of work that needs to be done to head off the supposedly fictive disaster, and the skills it requires, exceed the capacity of the information-systems department, and so a new industry has been born: Y2K remediation. Small hired-gun consulting firms have sprung up, armed with specialized tools and talents. As the demand grows, newborn offshore firms have rushed in to fill the vacuum: Israel, India and Ireland are leading sources of Y2K expertise, as daytime programming operations in Bangalore repair and run the code of New York corporations on New York computers, via telephone and satellite, during the New York night. But India, however overpopulated it may be, is not overpopulated with computer programmers; it has only 30,000 of them. Demand for programmers is outrunning the supply, and prices are high and getting higher – not least because this is a line of work that is doomed to vanish in a couple of years. To forestall mass desertions, Bank of America, for one, has set aside $100 million to fund post-2000 bonuses for programmers who don't jump ship.

Few software professionals have embraced Zvegintzov's view. But for those not directly involved in saving the world, reassurance that it really doesn't even need saving is welcome, and to generations accustomed to domestic tranquillity, the idea that they are being stalked by a disturbance of global dimensions is hard to accept. So, for the great majority of ordinary people and small- to medium-size businesses and organizations, the Millennium Bug remains something they read about somewhere, and then forget.

SPEAK NO EVIL

Even organizations that claim to have their problems under control, such as the city of Los Angeles, dislike talking openly about them, ostensibly because they do not want to create needless anxiety in the uninformed citizenry. In the face of this persistent silence, Peter de Jager, the world's leading and most peripatetic deliverer of Y2K warnings to business and government, created the “Damocles Project” to encourage whistle-blowers to come forward anonymously with information on Y2K-related situations with public-safety implications. But he recently ended the project and destroyed its files when he received legal advice that his records could be subpoenaed and he might not be able to protect the confidentiality of his sources. Thus one path for inside information from business and industry was cut off.

No one – not government, not business, not even individual employees – has much to gain by announcing costly problems before they occur, and so there is a notable lack of hard scientific evidence on which to build a reasoned case for one Y2K outcome as opposed to another. It remains difficult, and will probably remain difficult right up to the end, to separate rational predictions from dire fantasies, and so estimates of the dimensions of the Y2K threat will always be largely speculative.

The Securities and Exchange Commission now requires publicly held corporations to include Y2K statements in their annual reports. Naturally, most such statements are couched in cautious language, saying, in effect, “We can't know the future, and this is a future risk like other future risks.” Some, however, are slightly more expansive. Hewlett-Packard's statement, filed in March of this year, identifies a number of concerns, including “customer-satisfaction costs” – in other words, the irritation of customers who discover that computer and software manufacturers have for years been knowingly selling them products with a built-in, but undisclosed, expiration date. “Company believes,” the passage says, “that it is not legally responsible for costs incurred by customers to achieve their Year 2000 compliance.”

Legal experts say that computer and software manufacturers probably have no liability, even though when they sold their goods they knew, and their customers probably didn't, that they would no longer work after 1999. Nevertheless, common sense suggests that if companies knowingly sold computers and software that would not run past the end of 1999, and did not warn their customers, they should bear some responsibility. The argument that the expected service life of a computer or a program is only two or three years will not impress many end users. A New York law firm has already, in fact, filed a class-action suit against a Santa Monica software manufacturer, Symantec, for charging customers for Y2K-compliant upgrades to the widely used Norton Utilities, and the state of North Carolina is suing computer manufacturers, under a legal theory similar to that used in tobacco lawsuits, in an attempt to recover an estimated $132 million in remediation costs.

For many businesses, costs in good will, customer support and legal defense could run far beyond the amounts required to achieve compliance. The costs of litigating over due diligence and executive responsibility, over loss or interruption of business, or even over physical damage to persons and property, could be greater still; the cost of losing would be greatest of all. So large is the potential cost, in fact, that many states have passed or are weighing legislation that would make them immune to lawsuits over Y2K-related damages.

It is perhaps a manifestation of some natural law relating harmful events to the litigation they engender that one published prediction puts the cost of litigation arising from Y2K-related issues at two to three times the estimated repair cost: about $1 trillion.

Don Butte, an executive at Kraft Foods who had responsibility for the company's Y2K program, recently reported that with 300 people working full time on remediation, Kraft itself is “basically on schedule.” But 70 percent of Kraft's thousands of suppliers have not yet started to repair their computer systems, leading Butte to speculate that some smaller suppliers will probably elect to go out of business rather than face the expense of making the necessary revisions. He expects that only 60 percent of Kraft's suppliers can possibly be ready by the year 2000. Kraft is already involved in triage and contingency planning for supply interruptions that are now thought to be inevitable. One option being considered is to stockpile essential raw materials in advance of the millennium – an opportunity for futures traders, if they can just figure out which materials will be in short supply.

Since, by general consent, a large business needs at least a year to test its modified software thoroughly, the window of opportunity for getting the fixes done at all is rapidly closing. Companies systematically underestimate the magnitude of the problem and overestimate the likelihood of getting any software project completed on schedule, despite examples such as that of State Farm Insurance, which presciently began tackling Y2K in 1989 and still has 100 employees working full time on the project today.

Some of the widespread failure to act is due to a mistaken belief that the problem is confined to mainframe computers, and that PCs, which handle an increasing proportion of corporate, scientific and government computing, are not affected. In fact, even home PCs and consumer software packages – including Windows 95 – have potential Y2K problems, and some will require physical overhaul to solve them.

F IS FOR FEDERAL

Even if you're not prepared to believe that everything is going to fall apart after that fateful New Year's Day, there are still plenty of lesser prospective disasters to choose from. Besides the widely predicted business and financial failures, with an associated collapse of stock prices and possible global recession, slowdowns or breakdowns of government services are also likely, with the IRS, FAA and Department of Defense among the favorite victims. The federal government admits to facing costs of over $50 billion for Y2K remediation, and at the beginning of June received an F for its collective efforts from the House subcommittee charged with overseeing the effort.
