Forget Jonathan Franzen's commencement speech to Kenyon College graduates this summer, in which he lambasted “liking” (as defined by Facebook) and told students to switch off their electronics, engage with the world, and love instead. It's a compelling message, but it's one more iteration of an argument we've been hearing for a while: technology is bad for you, and the internet is screwing up your kids.

Cathy Davidson, an English professor and former Vice Provost of Interdisciplinary Studies at Duke, has sought to explode this myth with her new book, Now You See It, which examines the impact of technology on our attention spans and how this affects our capacity to learn and work. She'll be speaking on the topic at USC on September 8 as part of its “Visions and Voices” lecture series.

Now You See It, published by Viking on August 18, 2011; Credit: Courtesy of Cathy Davidson

While the New York Times continually publishes articles on the detrimental effects of technology on our concentration, Davidson takes the technological bull by the horns and argues that concentration hasn't gone downhill with the internet: we're just operating with an outdated notion of attention, in the workplace and at home. She has been critiqued for glossing over the neurological concepts that have made this subject so murky to date, and some think that while technology is pushing us to do many things at once, our brains aren't able to comply. But at the end of the day, Davidson's claim that monotasking (the idea that a person can focus on one single task at hand) is an unrealistic model of how the brain works seems strikingly persuasive.

Davidson also calls for reform in education, suggesting ways in which technology can be incorporated into the classroom to help kids become multitasking, problem-solving thinkers. If kids' brains are being rewired by the internet anyway, why not make something useful of it instead of lamenting their short attention spans? Now You See It tackles these issues with a radically optimistic stance, and whether or not Davidson's neurological claims will still hold up in several years, it's good to have an author chart a positive course for our lives with technology. LA Weekly spoke to Davidson and asked her a few questions about the future of education and technology.

It seems like the impact of technology on the brain has been a long-standing interest of yours, starting with your 2003 experiment at Duke where you gave free iPods to incoming freshmen. Could you talk about how you developed this interest, and how it led to Now You See It?

I developed it in several ways at once. The first was in the field I'm probably best known for: the impact of mass printing on American democracy and the public schools in the 19th century. The popular form that Thomas Jefferson and George Washington hated was the novel. It was the video game of the 19th century. It was criticized for making children asocial, causing them to take an early interest in sex, and distracting them from the good, sound words of the preacher.

When video games and social media started coming around and people began criticizing them, I started thinking, this is very similar to the arguments that had people worried in the last great information age.

I started reading the science. At the time, I was Vice Provost of Interdisciplinary Studies at Duke. I was reading all these dossiers of cognitive neuroscientists whom we were bringing in to run our new center. Added to that was the fact that Melinda Gates was one of my first donors — or goddesses.

How did the book project develop?

I had a strong reaction to what people were saying about technology. It's a horrible thing to say to people: your children's brains are being damaged by this digital age that they're invested in. That's a terrible curse to put on any kid.

I think that if you're 15 years old right now, you've spent your teenage years hearing the following: a. you're going to be poorer than your parents' generation; b. there are no jobs available; c. we're dealing with a worldwide depression and your future doesn't matter; and d. you're ruining your brain by looking at the internet. Well, the last point is simply not true.

Davidson, looking confident about the digital future.; Credit: John Rottet, courtesy of The News and Observer

What was the most exciting neuroscientific discovery you read about while researching the book?

On a cellular level, I really like the work being done by the Cambridge scientist [Alexa] Morcom, which says that the brain doesn't have a resting state. [Marcus] Raichle [one of the early pioneers in neuroscience] says that 80 percent of our mental activity is taken up with the brain talking to itself. The brain is very busy even when it's not distracted. It uses far less energy when it's multitasking than when it's in a deep, meditative state.

If you had to summarize your main argument in a few sentences, what would it be?

We know from neuroscientific work on attention blindness that in order to focus and pay attention, we have to exclude all kinds of things that are happening around us. We all see selectively. But we don't all select the same things to see. A neuroscientist might say that attention blindness is the limit of the human brain.

But we also live in an era where collaboration is so possible (in particular, online or through crowdsourcing) that we have to learn how to get a bigger picture by seeing together. Because of social media and the World Wide Web, we have possibilities for seeing in new ways, at the same time that our workplaces are already taking advantage of those multiple, distributed ways of working.

But we haven't really stopped and said, “Whoa, this is a different way of working together.” Do our schools support this? Our schools were created for the industrial age. Have we rethought this? Have we rethought our workplaces? We're right on time to rethink them — we're 15 years into the commercialization of the internet. We've seen a new generation that doesn't remember the “before.” We're at a point where we have to learn to use it for ourselves, instead of us being a tool for it.

Many of us suffer from attention blindness. One of your main examples of it in Now You See It involves the now-famous “gorilla experiment.” Could you talk us through the experiment, and explain how its results can help us understand how to use our attention?

We've known about attention blindness since the 1970s. In 1999, Daniel Simons and Christopher Chabris devised this experiment, which you can see on YouTube. In it, the audience is asked to watch a video that is less than two minutes long.

It shows six students at Harvard tossing a basketball around. Three are wearing black t-shirts, and three are wearing white t-shirts. Viewers are asked to count the passes made only by the people in white t-shirts. Partway through, someone in a full gorilla suit walks into this tiny area, beats her chest, and makes a face at the camera. Then the tester asks, “And who saw the gorilla?” Under normal testing conditions, over 60 percent of people never see the gorilla.

When you rewind the video and see what you've missed, you can't believe it. There are many experiments that show similar results, but the gorilla one is good because it makes us laugh, and then we're more emotionally invested in its results. I saw the gorilla because I'm dyslexic. I watched that tape and said, I'm not going to count this stupid tape. It was creepy — I was in a room with 200 people and no one else saw it.

Do you really believe that the internet has rewired our brains? Or has it just shown us that our brains work differently than we thought?

It's both. Does it rewire brains? Absolutely. Everything does. If you grow up in some small village without electricity, and you play with a ball while someone else plays with blocks, your brains are being wired differently. We're developing patterns every time we use something. Once something becomes easy, we no longer think of it as a task. Multitasking still strikes us as a task because we haven't made it easy yet.

I don't think there's such a thing as monotasking. If a Buddhist monk trying to reach Nirvana in a silent room can't keep his brain quiet, then the brain just doesn't know how to do one thing.

How do you believe that the education system can change to reflect new ways of processing information? How can we deal with the pitfalls of social media (for example, in bullying) or the distractions of the internet in the classroom?

Games are a fabulous way to learn. Things that challenge you, and then offer a bigger challenge when you do well, present a much more integrated way of learning. I've seen a very nice online algebra course being used in the Riverside school district. In the morning, you turn on your computer and get an algebra problem. When you answer it, you get a second problem: a harder challenge if you got the first one right, or a remedial question that goes back to an earlier principle if you got it wrong.

The great thing about this system is that by the end of the year, you've got incredibly complex data about how these kids are doing. It's about how they did in a year of problem-solving, as opposed to on one test.

That's what I do. I teach a class at Duke called “This Is Your Brain on the Internet.” We have a class blog where students have to post once a week and give stars or comments to each other's posts. The standard is really high. My students end up writing more than they do for any of their other courses where writing is required.

My big revelation came when I made them write a research paper and it sucked. These were the same students who were writing brilliant blog posts, but they left the paper until the night before, and they knew that I would be the only one reading it, so they fell back on their high school essay skills.

I think this approach would work in high school and junior high school too — don't let them be negative, just let them articulate what was good about each other's posts. That's hard enough.

Even though the internet is wonderful, it's also a massive time-suck. What's an efficient way to use the internet? Is there such a thing, or do we have false expectations?

One of the reasons that I want us to stop talking about the bad stuff is that we haven't figured it out yet. My favorite example is email. I go into an office that was made for the 20th century. I close the door to minimize distraction. Then, I turn on the computer. Well, what good is that going to do me? All this information is coming from various parts of my life in a huge mess.

We have to change the punch-clock view of the world. Not only is the whole world coming into our office, but we come home at night, and the office comes into our world. We've invented technologies for surveillance — I want technologies saying, Davidson spent 24 hours this week on this project. She should be paid for that. That's why I spent so much time looking at IBM. The people who invented the punch clock are no longer using the punch clock. They're now doing endeavor-based work, where you're evaluated on the projects you're doing.

What are some of the interdisciplinary studies of brain science that you've been drawn to?

Southern California is the hotbed of neuroscience right now. The Damasios [Antonio and Hanna, a husband-and-wife pair of researchers] at USC are doing incredible interdisciplinary work, as is [Dr. Vilayanur] Ramachandran in San Diego. The brain is everything humans do, and that includes emotions. If you have to choose between emotion and rational thought, go with emotion, because it'll motivate you to accomplish something. Antonio Damasio wrote a book about this, Descartes' Error: Emotion, Reason, and the Human Brain.

I think that interdisciplinary studies is really about attention blindness. I go into it wanting to be as open and collaborative as possible, to open myself up to new worlds of thinking. The exciting thing about interdisciplinary work is that no one knows where it's going to go yet.

Davidson will be speaking at the University of Southern California on September 8 at 7 p.m. as part of its “Visions and Voices” lecture series. See event details here: Now You See It, a lecture by Cathy Davidson.

