By Catherine Wagley
Photo by Anne Fishbein
I’m holding a doctoral dissertation titled “Message Delay In Communications Nets with Storage.” The original, not a photocopy, and even though I’ve just been handed it by its author, UCLA Professor of Computer Science Dr. Leonard Kleinrock, I still feel as if I should be wearing white cotton archivist’s gloves. Because what I’m looking at is no less than one of the blueprints for the Internet.
Kleinrock’s modest office at his Internet-gateway company, Nomadix, in Santa Monica, is filled with various plaques, paperweights, lucite blocks and something on a brass stand that looks like an upended suspension bridge, all of which pay tribute to his status as one of the fathers of the Internet. That thing on the brass stand is no less than the Marconi International Fellowship Award, one of the highest honors in telecommunications, presented by the prince of Belgium. In degree of prestige, the Marconi is runner-up to the L.M. Ericsson Prize, the Nobel Prize of telecommunications, presented by the king of Sweden. Kleinrock has one of those, too.
Yet what makes him light up is the Dodgers baseball bat inscribed with his name. After more than three decades in L.A., New York is still in Kleinrock’s voice. And one of the more fun aspects of having the City of Angels realize that it had one of the founders of the Internet in its back yard was Kleinrock’s tribute at a backslapping ritual in Dodger Stadium.
Kleinrock’s work is so fundamental, many of his explanations are followed by the phrase “but they weren’t calling it that yet.” Take that 1962 dissertation, for example. It introduces the idea of packet-switching, the underlying technology of the Internet. The term itself would not be coined until 1966, by Donald Davies, a physicist at Britain’s National Physical Laboratory.
Kleinrock came up with his breakthrough because of a phenomenon that is painfully familiar to graduate students everywhere: “The good problems had already been solved, and the ones that were left were hard and mostly unimportant.” While searching for his own topic at MIT, he recalls, “I was surrounded by computers, and I said, ‘One day, these machines have to talk to each other.’” Looking more closely, he observed how his fellow students were working. “I saw that when people sat down at their keyboards, they spent most of their time scratching their heads.”
Scratching your head is fine when computers are plentiful and cheap, but not back in the day when computing resources were precious. Within living memory, although most who went through the experience would rather forget it, computers were enormous, expensive machines shared by many users. Since one job would not take up all the resources of the computer, a system was created that ran many jobs simultaneously, with the computer deciding which one had priority. The idea was called time-sharing, and it inspired Kleinrock to reconsider how resources were allocated across networks. “In time-sharing, you chop things into little bits, and the short messages get through more quickly, the short jobs get through more quickly. That’s what you want with packet-switching,” explains Kleinrock.
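The scheduling idea Kleinrock describes can be sketched in a few lines. This is an illustrative toy, not his actual system: a round-robin scheduler gives each job a small fixed quantum of time in turn, so a short job finishes quickly instead of waiting behind a long one.

```python
# Toy round-robin time-sharing sketch (illustrative only).
# Each job gets one quantum of "CPU time" per pass through the queue.
from collections import deque

def round_robin(jobs, quantum=1):
    """jobs: dict of name -> total time units needed.
    Returns (name, finish_time) pairs in completion order."""
    queue = deque(jobs.items())
    finished = []
    clock = 0
    while queue:
        name, remaining = queue.popleft()
        run = min(quantum, remaining)
        clock += run
        remaining -= run
        if remaining == 0:
            finished.append((name, clock))   # job done at this clock tick
        else:
            queue.append((name, remaining))  # back of the line for another turn
    return finished

# A short job submitted alongside a long one still finishes early:
print(round_robin({"long": 10, "short": 2}))  # [('short', 4), ('long', 12)]
```

Run sequentially, the short job would wait 10 units behind the long one; chopped into quanta, it is done after 4.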
With packet-switching, Kleinrock applied the “resource-sharing” concept behind time-sharing to communications. Each message is broken into “packets.” Each packet has a “header” that indicates where it is in the message, rather like the disassembled London Bridge — each piece of the bridge was numbered, so that it could be put back in the proper configuration when it arrived in Lake Havasu. Most importantly, because of the headers it doesn’t matter when or in what order the packets are sent — they will all be reassembled correctly on the receiving end, so you do not need to have a continuously open connection, as with circuit-switching, the technology behind traditional telephone networks.
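The London Bridge analogy translates directly into code. Here is a toy sketch (the names are illustrative, and this is no real protocol): each packet carries a sequence-number “header,” so the message reassembles correctly even when packets arrive out of order.

```python
# Toy packetization sketch: a sequence-number header per packet lets the
# receiver reassemble the message regardless of arrival order.
import random

def packetize(message, size=4):
    # Split the message into chunks; pair each with its sequence number.
    return [(seq, message[i:i + size])
            for seq, i in enumerate(range(0, len(message), size))]

def reassemble(packets):
    # Sort on the header (sequence number), then concatenate payloads.
    return "".join(payload for _, payload in sorted(packets))

packets = packetize("numbered like the London Bridge")
random.shuffle(packets)  # simulate out-of-order arrival over the network
assert reassemble(packets) == "numbered like the London Bridge"
```

Because ordering lives in the header rather than in the connection, no continuously open circuit is needed, which is the break from circuit-switched telephony.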
Granted, this is not the sort of stuff that makes laypeople hyperventilate. But at least the engineering community got it — several years later. “I published it, and nobody cared. Nobody cared. The people who in particular did not care was AT&T. I’d say to the telephone guys, ‘You don’t understand, you want to charge me for a three-minute call, it takes me 25 seconds to dial up the call, and I want to send a millisecond of data.’ And their answer was, ‘Little boy, go away.’ The reason was I didn’t represent a source of revenue, and they were absolutely right — data wasn’t a source of revenue, because there was no data around, there were no data networks to speak of. So in that sense they were right, but in the big sense they were dead wrong.”
Kleinrock’s work would eventually be recognized by Larry Roberts, then the head of computer and communications research at ARPA, the Defense Department’s Advanced Research Projects Agency, formed by President Eisenhower using Sputnik as the excuse for the interdepartmental science agency he had long advocated. Simply out of a desire to solve the problem of how to more efficiently use the resources of highly expensive computers, Kleinrock and the other ARPA researchers created the ARPANET — precursor to today’s Internet.
UCLA was the first node on the Internet, and under Kleinrock’s supervision, graduate student Charlie Kline sent the first message. By comparison, “Watson come here!” sounds like the Iliad. Kline was attempting to log in; he got as far as the L and the O before the system crashed.
And now we come to Internet Origin Myth Number 1: No, it wasn’t designed to survive nuclear attacks. Kleinrock, who participated in the History Channel’s special on the Internet, sighs, “Even the History Channel confused that, unfortunately. I’ll tell you where that comes from: Paul Baran at RAND was studying [the survivability] problem, and he came up with the concept of packet-switching after I did. His report came out in ’64; my work was in ’61-’62.” Kleinrock’s work, not Baran’s, was used as the underlying technology for the ARPANET. Nobody has yet definitively researched how the nuke-survival myth made it into popular culture, although there’s a theory it had to do with researchers looking for more funding.
And now for Internet Origin Myth Number 2: Of course, Al Gore wasn’t “the Father of the Internet” — but he does deserve credit as its rich uncle. Kleinrock is sympathetic to Gore, and grateful for his role: As a senator, “Al Gore did some great funding of the High Performance Computing and Communications Initiative, which Bush signed as his last act,” Kleinrock says. “Gore was very important in providing significant funding, which launched the gigabit-network initiative and gigabit testbeds; I testified for him and wrote papers for the network. He was very important in that; he just made a slip when he talked about ‘creating the Internet.’”
Internet Origin Myth Number 3 is a grander misconception: that this was all some utopian, academic vision about shared community. ARPA was an engineer’s paradise — nobody had any glorious ideas about changing society. That came later, and not from the technical pioneers. As Kleinrock says, “We didn’t see the community side of it. People had access to computer utilities, but not so much access to each other. It’s only in ’72 when e-mail came on — and then I said, ‘Ahhh, this is about community! Not about machines talking to each other.’”
But in the year 2000, Kleinrock has concerns beyond community. “The speed of light is too slow,” he says in all seriousness. “Once we got to gigabit networks, we bumped into it. In free space it takes 15,000 microseconds to cross the U.S. at the speed of light — that’s an eternity when you’re talking about nanosecond speeds. It’s mind-boggling — the speed of light is too slow.”
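Kleinrock’s figure checks out on the back of an envelope. Assuming a coast-to-coast distance of roughly 4,500 km (my assumption, not his stated number):

```python
# Back-of-the-envelope check of the "15,000 microseconds" figure.
C = 299_792_458           # speed of light in vacuum, m/s
distance_m = 4_500_000    # ~4,500 km across the continental U.S. (assumed)

seconds = distance_m / C
microseconds = seconds * 1e6
print(f"{microseconds:,.0f} microseconds")  # roughly 15,000 us, i.e. 15 ms
```

Fifteen milliseconds is indeed millions of clock cycles on hardware that switches in nanoseconds.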
But the speed of light is less of an issue than the protocols, the software architecture, which is still the same stuff that was written back in the Kennedy administration, with four decades of ad hoc updates pasted on. Now, a ’62 ’Vette may be a beautiful thing. But would you really want to take an SUV head-on in one?
The network that Kleinrock envisioned was supposed to carry text and rudimentary graphics files, not enormous multimedia files of bootlegged copies of Pamela Anderson’s honeymoon video and every film that was turned down by Sundance. Nevertheless, Kleinrock explains, there is hope: The Internet relies on a modular structure. The protocols that govern it are stacked in layers, each layer hooking into the one below and above it, like a column of Lego blocks. And as with Legos, you can pull out one piece, make changes to it and put it back without affecting the layers above and below.
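The Lego-block idea can be shown in miniature. This is a toy illustration, not the real Internet stack: each layer wraps the output of the layer above it, so one layer can be swapped for a replacement without touching its neighbors.

```python
# Toy layered-protocol sketch (illustrative names, not real protocols).
def app_layer(text):
    """Top layer: the message itself, as bytes."""
    return text.encode()

def transport_v1(payload):
    """A 'transport' layer: prefix a 2-byte binary length header."""
    return len(payload).to_bytes(2, "big") + payload

def transport_v2(payload):
    """A drop-in replacement layer with a different header format."""
    return b"LEN:" + str(len(payload)).encode() + b"|" + payload

def send(text, transport):
    # The layers above and below don't care which transport is plugged in.
    return transport(app_layer(text))

print(send("LOGIN", transport_v1))  # b'\x00\x05LOGIN'
print(send("LOGIN", transport_v2))  # b'LEN:5|LOGIN'
```

Swapping `transport_v1` for `transport_v2` changes the wire format but nothing else — which is why, as Kleinrock says, the stack can evolve piecemeal.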
But in these days of intense concern over the vulnerability of Web sites, Kleinrock offers this historical caveat: “Security was not a high-priority item — it’s never been. To patch security on later is hard, you can’t start over again.” The problem is that although the modular structure allows for changes, you have to roll out all of those changes over the entire Internet, and as Kleinrock warns, “There’s already 100 million machines out there.”
So the hardware will increase, and the protocols will survive, but what about the culture? To hear Kleinrock describe the early days, before extortionistic denial-of-service attacks and land-grabbing patent claims, is almost painful. It’s engineering before The Fall. “In my design, I had the notion of an egalitarian network with distributed control. Deep down in the bowels of the technology you have that idea, and then it also came out at the cultural level. Ideas were freely shared, nothing was possessed or owned or proprietary. It was a wonderful community of people who trusted each other.”
There is one place these ideals still thrive — in the open-source movement that produced the Linux operating system. In his influential book The Cathedral and the Bazaar, programmer and open-source theorist Eric S. Raymond writes, “Open-source software is not a new idea; its beginnings go back to the beginnings of the Internet 30 years ago.”
Of course, not everybody is out there hacking Linux. My father and Dr. Kleinrock are the same age. Dad is gamely trying to master the Internet (“Karen, this stuff is a damn pain in the ass!”), while Kleinrock, 40 years on, is still working on ways to make that cry less frequent.
His latest project is Nomadix, a company that makes a thing called a Universal Subscriber Gateway (USG) — what the computer industry likes to call “a complete solution” — a box that combines both the hardware and the software necessary to eliminate what’s becoming one of the major nightmares of business life: incompatibility. Just try to get the laptop you use at the office to work with your home ISP, your hotel’s ISP — or even your own company’s corporate network when you visit another branch.
“The trouble is, your laptop is an alien when it is transported to other networks, and it won’t be accepted. You can try to adjust all the parameters yourself, which is craziness, or you could load some special software in your machine, which is also a pain in the neck,” says Kleinrock. The USG takes care of all of that. In a sense, this is an extension of the same question Kleinrock was addressing nearly 40 years ago: how to get computers to talk to each other. But the question then was connecting enormous, stationary computers; now we’re in the world of what Kleinrock calls “nomadic computing.” “People are no longer tied to their desktop machines, they’ve become nomads, and they need connectivity wherever they go. With a wireless PDA [Personal Digital Assistant], I’m going from network to network — am I going to change the I.P. address every time?”
Nomadix’s work is laying the foundation for what Kleinrock sees as the next phase of computing: “I.P. services everywhere, as well as connectivity — my glasses, my shoes, my belt, everything would be on a bodynet. As I approach an idle computer, it will take on my profile, my applications. The Internet will be everywhere — always on, always accessible, but most of all, invisible. Just like electricity, it’s there, you never think about it, you depend upon it and use it.”
If that sounds familiar, it’s because “ubiquitous computing” has been a trendy idea for several years, but Kleinrock foresaw it decades before there were “wearable computer” fashion shows. On July 3, 1969, UCLA put out a press release about what one of its young professors was up to: “‘As of now, computer networks are still in their infancy,’ says Dr. Kleinrock, ‘but as they grow up and become more sophisticated, we will probably see the spread of computer utilities, which, like present electric and telephone utilities, will serve individual homes and offices across the country.’”
Looking back, Kleinrock says, “I saw the ubiquity, I saw the accessibility, I saw the invisibility. But I didn’t see my 92-year-old mother on the network.” I agree, and then indulge in a bit of nostalgia for the paradise of arcana the Internet once was. But Kleinrock gently chastises me: “There are those who bemoan and regret that it’s no longer our private network — I totally disagree. I think the fact that it’s reached out to everybody is proof of its success. You don’t want to own penicillin that nobody uses.”