Capitalists needed Darwin to explain how something apparently so cruel could be good for us. Communists needed Marx to explain why something so ineffectual would triumph in the end anyway. When systems reveal their weaknesses, we try to find a theoretical justification to help us maintain our faith through the hard times. So what Steven Johnson, the founder of Feed magazine, is trying to do in Emergence: The Connected Lives of Ants, Brains, Cities, and Software is timely. Just as the New Economy appears to be heading into its first prolonged period of self-doubt, it has found a philosopher, as well as a bullish prognosticator. His goal is to buck up our faith in Internet culture. To do that he has to domesticate it, humanize it, tie it to ideas and experiences we already have.

“Is the Web learning as well?” he asks at one point in the book. A quick take on the Zeitgeist would suggest that we mostly hope not. We like to tell computers what to do rather than to be told. We want them to have a toehold but not a neckhold. Yet how can one resist in the abstract the idea that the Web could be our meta-consciousness — in Johnson’s term, our “global brain”? Who would not want the power of all our minds united? How churlish. Yet the dot-com bust suggests our independent selves are not to be overruled so easily.

Maybe we've already voted, then. Intuiting our resistance, Johnson takes an oblique approach to convincing us of the necessity of living controlled by lifeless, bleeping machines. He declares his book to be about something more general: “self-emergent systems,” those that organize themselves without any central authority directing them. “They solve problems by drawing on masses of relatively stupid elements, rather than a single, intelligent ‘executive branch.’” Examples are ant colonies, city neighborhoods and the computer game SimCity. In all these activities, semi-self-aware groupthink creates better results than either a top-down management system or pure chance.

The most promising example of the power of “relatively stupid elements” comes from mathematics. In the late ’90s, a programmer named Daniel Hillis wanted to use a supercomputer to solve the puzzle of how you sort 100 numbers in the fewest steps. Deciding to let the computer figure it out itself, he instructed it to write thousands of programs and let them compete with each other. He extracted the best algorithms from the more successful ones until he'd gotten a program better than he could come up with — it got the algorithm down to 62 steps. “I have carefully examined their instruction sequences, but I do not understand them,” Hillis wrote afterward in his 1998 book, The Pattern on the Stone. “I have no simpler explanation of how the programs work than the instruction sequences themselves. It may be that the programs are not understandable.” Something primitive made something brilliant.
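Hillis's procedure is an early instance of what is now called genetic programming: breed random programs, score them, keep the winners, cross-breed and mutate. A minimal sketch of that loop, with toy parameters of my own choosing (list length, population size and scoring rule are illustrative assumptions, not Hillis's actual setup):

```python
import random

random.seed(0)
N = 8                      # length of the lists to sort (toy size, not Hillis's)
POP, GENS, PLEN = 100, 40, 25

def random_program():
    # a "program" is a fixed sequence of compare-and-swap index pairs
    return [tuple(sorted(random.sample(range(N), 2))) for _ in range(PLEN)]

def run(program, data):
    data = list(data)
    for i, j in program:
        if data[i] > data[j]:
            data[i], data[j] = data[j], data[i]
    return data

def fitness(program):
    # smooth score: in-order adjacent pairs, summed over all test inputs
    return sum(out[k] <= out[k + 1]
               for t in TESTS for out in [run(program, t)] for k in range(N - 1))

def breed(a, b):
    cut = random.randrange(1, PLEN)
    child = a[:cut] + b[cut:]
    if random.random() < 0.3:          # occasional mutation of one comparator
        child[random.randrange(PLEN)] = tuple(sorted(random.sample(range(N), 2)))
    return child

TESTS = [[random.randrange(100) for _ in range(N)] for _ in range(30)]
pop = sorted((random_program() for _ in range(POP)), key=fitness, reverse=True)
start_best = fitness(pop[0])
for _ in range(GENS):
    elite = pop[:POP // 4]             # keep the more successful programs
    pop = elite + [breed(random.choice(elite), random.choice(elite))
                   for _ in range(POP - len(elite))]
    pop.sort(key=fitness, reverse=True)

print("best score:", start_best, "->", fitness(pop[0]), "of", 30 * (N - 1))
```

Because the elite survive every generation, the best score can only improve; what the loop cannot promise, as Hillis found, is that anyone will understand the winning comparator sequence afterward.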

Ant colonies also display emergent adaptive behavior, and they make for the best reading in the book. Johnson dismisses what he calls “the Stalinist ant stereotypes . . . In fact, colonies are the exact opposite of command economies.” I had my doubts — I’d seen Antz — but he is persuasive. There is no Stalin in an ant colony. There is no queen, really. It's a misnomer. No ant tells another ant what to do. They decide their behavior by sensing other ants’ behavior through their pheromone trails. Do I forage or nest-build? Do I go on guard duty? It depends on what the others in the colony are doing. Collectively, the ants' decision making is a tool as powerful as Hillis’ supercomputer.
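That decision rule is easy to simulate. In this toy model (my construction, not an example from the book), each ant repeatedly picks a task by sensing only a handful of other ants, the way pheromone encounters work, and a stable division of labor emerges with no queen issuing orders:

```python
import random

random.seed(1)
ANTS, SAMPLE, ROUNDS = 400, 8, 100
DEMAND = 0.6     # hypothetical stimulus level: the nest "needs" mostly foragers

# every ant starts at random; no ant ever sees the colony-wide tally
tasks = [random.choice(["forage", "build"]) for _ in range(ANTS)]

for _ in range(ROUNDS):
    for ant in range(ANTS):
        met = random.sample(range(ANTS), SAMPLE)    # a few chance encounters
        foraging = sum(tasks[j] == "forage" for j in met) / SAMPLE
        # take up whichever task seems locally under-staffed
        tasks[ant] = "forage" if foraging < DEMAND else "build"

share = sum(t == "forage" for t in tasks) / ANTS
print(f"foragers: {share:.0%}")
```

The colony settles into a steady mix of foragers and builders even though every individual decision uses only an eight-ant sample, which is the whole of Johnson's point about decentralized control.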

Johnson adds a bunch of other self-emergent systems to his argument — slime mold, the goldsmiths of medieval Florence and video-game artists. He gives a nod to the decentralized management technique of some business gurus and the protest groups at meetings on global trade who work without central leadership. And, since a paradigm shift such as emergence needs an intellectual parentage, too, Johnson traces the “emergence of emergence” theory to early studies of city life. His pagan poet, his Virgil, is Friedrich Engels, who wrote a study of working-class Manchester in the 1840s. Engels believed that a conspiracy of the powerful, “an unconscious, tacit agreement,” kept the rich and the poor segregated. Yet he also guessed that this analysis was too simple.

According to Johnson, what was really happening was thousands of local decisions: People like to be near the people they work with; they like to be near the shops that have what they need, etc. Together, they make a neighborhood. Engels couldn't see the emergent system for the conspiracy theory he preferred. Johnson contrasts him with Jane Jacobs, who in The Death and Life of Great American Cities, her 1961 book about, in part, New York’s West Village, proposed that “Vital cities have marvelous innate abilities for understanding, communicating, contriving and inventing what is required to combat their difficulties.” She was the first to see the light.
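The kind of explanation Johnson credits to Jacobs was later formalized by the economist Thomas Schelling, whose segregation model (my illustration, not an example from the book) shows mild local preferences producing neighborhood-scale patterns no conspiracy decreed:

```python
import random

random.seed(2)
SIZE, RADIUS, WANT = 80, 2, 1 / 3   # mild taste: a third of neighbors alike
city = [random.choice("ab") for _ in range(SIZE)]
before = sum(city[i] != city[(i + 1) % SIZE] for i in range(SIZE))

def unhappy(i):
    # look only at the few nearest households on a circular street
    nbrs = [city[(i + d) % SIZE] for d in range(-RADIUS, RADIUS + 1) if d]
    return sum(n == city[i] for n in nbrs) / len(nbrs) < WANT

for _ in range(20000):
    i, j = random.randrange(SIZE), random.randrange(SIZE)
    if unhappy(i) and city[i] != city[j]:
        city[i], city[j] = city[j], city[i]   # the unhappy household moves

after = sum(city[i] != city[(i + 1) % SIZE] for i in range(SIZE))
print("".join(city))
print("borders between blocks:", before, "->", after)
```

No agent wants segregation; each merely wants not to be badly outnumbered. Yet the street sorts itself into blocks, which is exactly the gap between what Engels inferred and what was happening.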

Emergent systems are “revolutionary” — Johnson says it several times. But to me, the idea seems more evolutionary, in both senses. Emergence emerges as an interesting subclass (to use another of his terms) of Darwin's theory of natural selection. It shifts the emphasis from self-interest to something more inclusive: evolution as an act of parallel decision making. Us, not me. It rearranges the gears a bit, but it doesn’t change the central paradigm of the world as a machine that goes on its own. And if the world is a watch, we still need to know the watchmaker. Hillis even had to step in and choose which programs to cross-breed (not to mention that a traditional programmer once solved the problem in fewer steps).

Still, Johnson is quite bullish on the future role emergence will play in our economic and cultural thinking. He asks: “Can th[e] chain be extended in a new direction — both on the atomic scale of digital information and the macroscale of collective movements? Will computers — or networks of computers — become self-aware in the coming years, by drawing upon the adaptive open-endedness of emergent software? Will new political movements or systems explicitly model themselves after the distributed intelligence of the ant colony or the city neighborhood? . . . Is there a genuine global brain in our future, and will we recognize ourselves in it when it arrives?”

The answer comes in two parts. The system, Johnson writes, is powerful enough to do anything — even make the Web think — “but it is both the promise and the peril of swarm logic that . . . you never really know what lies on the other end of a phase transition.” In other words, the future has to emerge, but we don't know where or how. That’s a clever hedge. Elsewhere in Emergence, he argues that it has already begun, in Web sites where the community rates the postings, and in computer programs that examine your buying habits and then suggest purchases. You liked John Grisham; try Along Came a Spider.

I know that when discussing computers one has to leave room for future improvement, but it does seem worth mentioning my own experience with software that tries to guess one's mind: I bought the Oxford English Dictionary, and the software recommended Nora Roberts and The Professor and the Madman. Nora Roberts I can’t even begin to explain, but The Professor and the Madman I get: It's a book about the writing of the dictionary. The analogy is a simple one. It’s not based on style or genre or world-view. It's just a word match.
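The last line is easy to make concrete. A crude recommender that ranks titles by shared words (a hypothetical catalog, not any real site's algorithm) reproduces exactly this experience:

```python
# Score candidate titles by raw word overlap with the purchase, nothing deeper.
CATALOG = [
    "The Professor and the Madman: A Tale of Murder, Insanity, and the "
    "Making of the Oxford English Dictionary",
    "Along Came a Spider",
    "The Pattern on the Stone",
]

def words(title):
    # lowercase the title and strip trailing punctuation from each word
    return {w.lower().strip(":,") for w in title.split()}

def recommend(purchase, catalog):
    # pick the title sharing the most words with what was bought
    return max(catalog, key=lambda t: len(words(purchase) & words(t)))

pick = recommend("The Oxford English Dictionary", CATALOG)
print(pick)   # the Winchester book wins, purely on shared words
```

The match works for no reason deeper than that “Oxford,” “English” and “Dictionary” appear in both titles: a word match, not a reading of anyone's mind.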

Think back to the computer programmer Daniel Hillis and his remarkable confession that the program he had bred was smarter than any he could have written himself. The appeal of this confession is its modesty. A great programmer out-programmed. At first it seems value-neutral, but examine it a little closer, and it is in fact highly conservative. I have no idea why it works, but as long as it works, I trust it. I’ll be passive. That's the attitude that frightens me — and others, I think — about the wired world. We like looking up listings online, but we don’t necessarily trust the computer to tell us what to see. We want our friends to do that. Or to come up with it ourselves. Or to stay home. I'm not sure the author cares about this.

That’s my real qualm with Johnson, whose inventiveness I admire. He doesn't have enough respect for our uniqueness, for the immense complexity of our consciousness. SimCity, StarLogo, slime mold, ant colonies — his book is curiously empty of people. Unlike the city in Jacobs’ Death and Life, it's a techie-seeming place, easy to think about, where the immense human desire to connect, to innovate, to make our own reality, is underrated. Johnson sets out to humanize computers but in the end computerizes humanity. Strange from a man who used such human skills to write this book.

LA Weekly