Art by Santiago Uceda
This program, Truster, doesn't tell you what's on the other side of the end of the universe or give you the answer to "What is the sound of one hand clapping?" But it does claim to be able to sift through the more mundane of life's riddles and set apart those gems of veracity from the standard-issue bullshit.
On the computer screen, Truster looks like the control panel for a game like Doom or Quake. When you first fire it up, the words System Standby glow inside a black box at the top of the screen and a patchwork of lights and graphs sit motionless, waiting obediently to do their job. Click on "Start Test" with your mouse, maneuver your subject near the computer's external microphone (but not in view of the terminal) and begin your interrogation. The screen jumps to life.
Start out interviewing your subject with a few innocuous questions: "How's life?" and "Hot enough for you?" to let the machine get a read on the subject's demeanor. ("Calibrating," the black box informs you as it samples the interviewee's voice and determines his normal speech patterns.) Then the test begins in earnest. A "truth stress graph" traces steadily across the middle of the screen, charting a history of your victim's responses in peaks and dips. Just above, 11 dots light up from green to red at each statement, more green dots signaling more truth, more red revealing lies. And in that black box, after each statement, Truster issues its verdict in bold yellow letters: "False Statement," "Inaccuracy," "Avoidance" and -- once in a while, you hope -- "Truth." The whole time, a band of numbers across the top of the screen constantly quantifies the subject's present volume, excitement, stress and cognitive levels. A spiky sound wave scrolls by, giving a visual rendering of the subject's every word.
"WHO CAN YOU TRUST?" ASKS THE TRUSTER USER guide. Was that car really never wrecked? Was your spouse actually working late last Tuesday? Is the check truly in the mail? If you really need to know, Truster ("Your Personal Truth Verifier") can tell you. Truster is a CVSA, "a computerized voice stress analyzer." Originally developed for an Israeli software company to screen potential terrorists at the Israeli border, it is now sold by Valencia Entertainment to U.S. consumers.
Professional CVSAs have been around for decades, used in place of or in tandem with the polygraph machine by many police forces and government agencies. But you can't bop down to Fry's or CompUSA and pick one up, and unlike Truster, they don't retail for $179.95. Professional CVSAs are run by professionals, interrogators with weeks of training who can decipher the lines and waves and squiggles those units spit out. With Truster all you need is a PC, a phone and a well-developed sense of paranoia. The program does the rest.
Truster works by sensing four distinct sound levels in the human voice, running them through "a highly sophisticated algorithm," and returning a judgment about the veracity of what was said. The theory behind all this is that the human voice produces certain characteristic sound patterns in a normal conversation and very different sound patterns under the stress of telling a lie. These "microtremors," too subtle for the human ear to detect, can only be interpreted by a machine.
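Truster's actual algorithm is proprietary, but the microtremor theory the paragraph above describes can be sketched in a few lines of code. Everything here is illustrative, not Truster's method: the function names, the 8-to-12 Hz "tremor" band (the range usually cited in voice-stress literature), and the assumption that the input is a low-rate amplitude envelope of the voice are all assumptions for the sketch. Under the contested theory, stress *suppresses* the tremor, so less tremor-band energy than the calibrated baseline reads as more stress.

```python
import math

def tremor_energy(envelope, rate, lo=8.0, hi=12.0):
    """Toy measure of 'microtremor' strength: naive DFT power in the
    lo-hi Hz band of a voice amplitude envelope sampled at `rate` Hz.
    (Band and method are illustrative, not Truster's algorithm.)"""
    n = len(envelope)
    mean = sum(envelope) / n
    centered = [s - mean for s in envelope]  # remove DC offset
    energy = 0.0
    for k in range(1, n // 2):
        freq = k * rate / n
        if lo <= freq <= hi:
            # power at bin k via a direct (slow but dependency-free) DFT
            re = sum(s * math.cos(2 * math.pi * k * i / n)
                     for i, s in enumerate(centered))
            im = sum(s * math.sin(2 * math.pi * k * i / n)
                     for i, s in enumerate(centered))
            energy += (re * re + im * im) / n
    return energy

def stress_score(envelope, rate, baseline):
    """Ratio of calibrated baseline tremor energy to current tremor
    energy. Under the (contested) theory, stress suppresses the tremor,
    so a score well above 1 would be read as stress."""
    return baseline / max(tremor_energy(envelope, rate), 1e-12)
```

This also shows why the program "calibrates" on innocuous questions first: without a per-speaker baseline, a raw tremor-energy number means nothing, since voices differ far more between people than between a truth and a lie.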
But Truster only works if the person you suspect of lying doesn't know he's being tested. That shouldn't pose a problem in a phone interview -- the program comes with an adapter to run your telephone into your computer -- but even with the external microphone, Truster is designed to work only on "a free conversation, with no predefined purpose," where the need to tell a lie will come as a surprise to the subject. In other words, you must "Truster" your subject behind his back.
Wait a minute, you say, that sounds rather creepy -- in fact, it sounds like it should be illegal. That's what Elizabeth Schroeder says, too. "This is a very frightening intrusion into our privacy," according to Schroeder, the associate director of the Southern California ACLU. "To allow decisions to be made by machines which purport to reveal the truth is Orwellian and very dangerous."
Orwellian and dangerous perhaps. But at the moment, it's not illegal.
PRIVACY LAW IS A FAIRLY MODERN concept. While laws around property and murder have had thousands of years to develop, privacy didn't really become an issue until human beings, clever monkeys that we are, created the tools to invade it. The Fourth Amendment strictly discussed physical search and seizure, because at the time it was drafted, that was the only way to discover information thought to be held private. It wasn't until the telephone had been invented and wiretaps developed to tap it that the U.S. Supreme Court had to consider the limitations of that language. In Olmstead vs. United States (1928) the court ruled that Olmstead's Fourth Amendment rights had not been violated because wiretaps didn't actually search or seize anything. Six years later, Congress passed the Federal Communications Act, greatly restricting wiretap activities. And this has been the trend in modern privacy law ever since: Technology leaps ahead, the courts try a case that doesn't fit into the current legislation, Congress enacts new legislation to fit, technology leaps ahead . . .
For now, technology has leaped ahead. To be sure, there are instances where use of this device would be prohibited. The Employee Polygraph Protection Act, for instance, forbids the screening of applicants or the testing of employees by any "polygraph, deceptograph, voice stress analyzer . . . or any other similar device . . ." And certain questions, under fair housing laws and the Americans With Disabilities Act, you simply cannot ask. But what if Truster tells your future landlord you actually have smoked pot? Don't bother calling your attorney; she can't help you. Unless your landlord causes you public embarrassment with the information Truster revealed, your right to lie does not enjoy absolute protection. "There are certain things about someone's private life that aren't anybody's business," insists Schroeder, and on a moral or ethical level, she may be right. But legally, many details of one's personal life reside in the public domain.
Part of what protects Truster at the moment is that it is "designed to be used by a party to a conversation . . ." Because you are speaking with the person testing you and the conversation is not recorded, Truster does not fall under the restrictions of phone tapping or illegal surveillance. But USC professor of law Erwin Chemerinsky doesn't think this protection will last. He concedes that you are giving certain permissions in the context of a conversation, but he doesn't believe that does away with all your rights. "I don't think that you're consenting implicitly to somebody using a machine any more than you're consenting to being recorded." Chemerinsky projects that the continued intrusion that technology like Truster facilitates will eventually force legislators' hands. He echoes Schroeder's contention that certain parts of our lives are just private. Borrowing a phrase made famous in Justice Brandeis' dissent in the Olmstead case, Chemerinsky says, "in terms of the right to be let alone . . . this really is saying, 'we have the right to be let alone by machines, that machines shouldn't be evaluating us without our consent.'"
Legal matters aside, there is some doubt as to whether the Truster is all that good at evaluating anybody. The American Polygraph Association says that "there is no independent research . . . that voice stress analysis is an accurate means of detecting deception" -- precisely what the Supreme Court said in March 1998 when it denied a defendant's motion to admit polygraph results as evidence (United States vs. Scheffer). But Truster's makers claim that "controlled testing has found Truster's degree of credibility to be extremely high."
Faced with these divergent views, I decided to find out for myself. I installed Truster on my computer, hooked the adapter into my phone and left it running continuously for a week. Whenever anyone called, I clicked "Start Test," gathered a few voice samples so the machine could calibrate to the caller and started my interrogation. I Trustered my mother, my brother-in-law and some guy selling long-distance service.
With its sound waves scrolling, stress graphs charting and the summary judgment flashing "Inaccuracy," "Truth," "False Statement," I have to admit Truster looked as if it were revealing something. What, exactly, I'm not sure. I found it difficult to ask questions, watch the screen for the system's responses and not sound like Perry Mason grilling a murder suspect on the witness stand. Eventually, the caller would get suspicious, or I would lose track of which response matched which question, and the whole thing would end up hopelessly muddled. Evidently, I'm not that good at lying myself; it was hard to resist letting my subjects in on my secret activity.
"Results produced by the system may be wrong if an unskilled operator performs the test," warns the Truster manual, and it's right. More to the point -- and what the user guide fails to warn against -- is that the operator has to have something he wants to know. "If you have doubts about a certain person," the manual instructs, "Truster can help you confirm your suspicions." I have doubts about everyone, but I don't really want my suspicions confirmed.
What the Truster told me about my friends, relatives and professional contacts was what I already knew: They're occasionally untruthful, exaggerate at times and are often unsure. And I've decided that's okay, because it gives me license to be the same way.