In mid-May, as the Bush administration kicked off yet another “war on drug abuse in America,” White House press secretary Ari Fleischer announced that all 650 White House employees had been required to take drug tests “as a condition of employment.” The president and vice president were first in line. Fleischer would not comment on how this often-humiliating procedure was administered, who administered it, or how closely the president was watched — if he was monitored, as many are, to make sure he delivered his own warm, unadulterated urine directly from the presidential source. Fleischer would not even tell the press whether the president had passed or not. All results, the press secretary said, would “be treated as a private, personnel matter.”


Bush’s drug test is unusual in more ways than one. As common as such tests still are, he will likely be one of the few people in his income bracket required to surrender his excretions on the job. Workplace drug testing, once ubiquitous, may in the future be an indignity reserved for the poor. The most recent trends indicate that while testing rates have dropped, in some cases precipitously, in the white-collar world, they have dropped only slightly in blue-collar settings, and have actually risen among the growing proletariat of the new American economy, retail and wholesale service workers. Athletes aside, the primary subjects of drug tests may soon be prisoners, parolees, denizens of what was once called the working class and, if the state of Michigan prevails in court, welfare recipients. If the Drug War, both at home and abroad, has always been a war against the poor, its testing brigades, long confused enough to target hapless middle-class professionals, have lately been sharpening their sense of mission.


Drug-testing technology has been around for decades. The military began using it extensively in the 1970s, but it wasn’t until Ronald Reagan whipped the country into a frenzy of anti-drug hysteria that testing wormed its way into the civilian world. In 1986, with the goal of creating “drug-free federal workplaces,” Reagan ordered government employees to begin handing over their urine. Around the same time, riders were added to federal contracts demanding that anyone who did business with the government — whether manufacturers, universities, or state and local governments — take steps to create a drug-free work force. Congress formalized the practice into law with the 1988 Drug-Free Workplace Act. Several late-’80s Supreme Court cases confirmed the government’s right to override Fourth Amendment protections against unreasonable searches in the name of a perceived threat to public safety. Sharply dissenting in one such case, Justice Thurgood Marshall warned, “History teaches that grave threats to liberty often come in times of urgency, when constitutional rights seem too extravagant to endure . . . There is no drug exception to the Constitution.”


It was the government, aided by a growing multibillion-dollar drug-testing industry, that continued to lead the charge for testing, barraging private employers with what the ACLU calls a “maelstrom of misinformation” about the supposedly disastrous effects of drug use on worker productivity. Often encouraged by states that gave breaks on workers’ compensation insurance to companies that tested their employees, the private sector patriotically joined the fight and took up arms against the enemy within. By 1996 (when the Personal Responsibility and Work Opportunity Reconciliation Act was passed, ending welfare as we knew it and giving states the right to deny benefits to welfare recipients if they came up positive on drug tests), the vast majority of employers had developed a hearty appetite for workers’ urine. That year, the American Management Association found that 81 percent of major American companies were testing employees, up from 22 percent nine years earlier. Submitting to drug tests had become a standard part of having a job in America.


Throughout the 1990s, though, study after study called the benefits of the practice into question. Most notably, an extensive 1996 National Academy of Sciences report found that existing data “do not provide clear evidence of the deleterious effects of drugs other than alcohol on safety and other job-performance factors.” Most drug users, the NAS found, are not addicts, and most do not get high at work (though it might not hurt if they did — cocaine use on the job, the NAS reported, appears to have “slight performance-enhancing effects”). Flatly contradicting the puritanical assumptions underlying the Drug War, the study concluded that “low to moderate use of any illicit drug or alcohol is either positively associated with productivity, or simply not related.”


Other groups, notably the American Civil Liberties Union and NORML, have been arguing for years that even if drug use did affect job performance, drug tests wouldn’t measure the effects. “Drug testing as it is constructed today,” says Allen St. Pierre of NORML, “absolutely, 100 percent does not measure impairment.” Because urinalysis detects only the presence of metabolites, the chemical traces of previously consumed substances, and not the substances themselves, it can, for instance, prove that you smoked a joint on Friday evening but not that you snorted a line of coke on Monday morning immediately before taking the test. On-the-job drug testing, therefore, has more to do with monitoring and controlling an employee’s off-the-job life than with ensuring his or her sobriety while at work.


Even within corporate managerial circles, drug-testing dogma has been called into question. In 1996, after reviewing data compiled since 1987, the American Management Association announced that “No finding of AMA’s nine-year survey efforts can confirm with statistical certainty that testing deters drug use.” Two years later, a survey of 63 high-tech firms confirmed the AMA’s results, revealing not only that testing had failed to improve worker productivity but that “Surprisingly, companies adopting drug-testing programs are found to exhibit lower levels of productivity than their counterparts that do not,” by as much as 20 percent.


At some point in the late 1990s, the corporate world began listening, but only selectively. The AMA’s most recent data shows that while 74 percent of all companies tested in 1997, only 67 percent were still testing by 2001. Broken down a bit further, the results get more interesting. Drug testing in the highly paid financial-services sector dropped by more than half, from 47 percent in 1997 to 23 percent this year. Testing in other white-collar fields, in what the AMA terms the “business and professional sector,” dropped by more than 20 percent. Meanwhile, manufacturing, the most highly tested category four years ago, retained that status; testing there dropped slightly, from 86 percent to 81 percent. Only in the lowest-paid, least-skilled category — wholesale and retail employees — did drug-testing rates rise, from 61 percent to 65 percent. In short: White-collar workers are being tested less and less, while it’s becoming increasingly likely that waitresses and salesclerks will have to produce a cup of urine on demand.


So Wall Street brokers can rest assured that nothing will get between their bladders and the executive toilets, but stock boys at Vons have to hand over a sample with their job application before performing sensitive tasks like pricing potato chips. “It really doesn’t make sense,” observes Graham Boyd of the ACLU’s Drug Policy Litigation Project. The rift has been especially deep in the tech world, where, the Los Angeles Times reported last fall, two-tier testing programs sometimes exist within one firm. Intuit, for example, screens the workers at its telephone-help center in Nevada for drugs, but not those at its corporate headquarters in Silicon Valley. Similarly, Amazon.com tests its employees at some of its outlying distribution centers, but not at its Seattle home office.


Such disparities, as is so often the case, have largely been determined by the market. At the high end, skilled white-collar workers were in high enough demand in the late ’90s that companies feared losing potential employees to other employers who did not drug-test. At the low end, though, the boom years supplied a steady pool of easily replaceable service workers. If they didn’t want to take a drug test at Wal-Mart, there was nowhere else to go — Kmart drug-tests too. Class stereotypes equating poverty with drug abuse (like those behind Michigan’s attempt to make a clean drug test a precondition for receiving welfare benefits) surely also played a role. Regardless of the causes, though, the result is the same: another ritual of humiliation for the working poor, a clear sign from management that your body, like your time, is not your own.
