'The Shallows': This Is Your Brain Online
Try reading a book while doing a crossword puzzle; that, says author Nicholas Carr, is what you're doing every time you use the Internet.
Carr is the author of the Atlantic article "Is Google Making Us Stupid?" which he has expanded into a book, The Shallows: What the Internet Is Doing to Our Brains.
Carr believes that the Internet is a medium based on interruption -- and it's changing the way people read and process information. We've come to associate the acquisition of wisdom with deep reading and solitary concentration, and he says there's not much of that to be found online.
Chronic Distraction
Carr started research for The Shallows after he noticed a change in his own ability to concentrate.
"I'd sit down with a book, or a long article," he tells NPR's Robert Siegel, "and after a couple of pages my brain wanted to do what it does when I'm online: check e-mail, click on links, do some Googling, hop from page to page."
This chronic state of distraction "follows us," Carr argues, long after we shut down our computers.
"Neuroscientists and psychologists have discovered that, even as adults, our brains are very plastic," Carr explains. "They're very malleable, they adapt at the cellular level to whatever we happen to be doing. And so the more time we spend surfing, and skimming, and scanning ... the more adept we become at that mode of thinking."
Would You Process This Information Better On Paper?
The book cites numerous studies indicating that online reading yields lower comprehension than reading from a printed page. Then again, reading online is a relatively recent phenomenon, and a generation of readers who grow up consuming everything on the screen may simply be more adept at online reading than people who were forced to switch from print.
Still, Carr argues that even if people get better at hopping from page to page, they lose the ability to employ a "slower, more contemplative mode of thought." He says research shows that as people get better at multitasking, they "become less creative in their thinking."
The idea that the brain is a kind of zero-sum game -- that the ability to read incoming text messages somehow diminishes our ability to read Moby-Dick -- is not altogether self-evident. Why can't the mind simply become better at a whole variety of intellectual tasks?
Carr says it really has to do with practice. The reality -- especially for young people -- is that online time is "crowding out" the time that might otherwise be spent in prolonged, focused concentration.
"We're seeing this medium, the medium of the Web, in effect replace the time that we used to spend in different modes of thinking," Carr says.
The Natural State Of Things?
Carr admits he's something of a fatalist when it comes to technology. He views the advent of the Internet as "not just technological progress but a form of human regress."
Human ancestors had to stay alert and shift their attention all the time; cavemen who got too wrapped up in their cave paintings just didn't survive. Carr acknowledges that prolonged, solitary thought is not the natural human state, but rather "an aberration in the great sweep of intellectual history that really just emerged with [the] technology of the printed page."
The Internet, Carr laments, simply returns us to our "natural state of distractedness."
Excerpt: 'The Shallows'
Pundits have been trying to bury the book for a long time. In the early years of the nineteenth century, the burgeoning popularity of newspapers -- well over a hundred were being published in London alone -- led many observers to assume that books were on the verge of obsolescence. How could they compete with the immediacy of the daily broadsheet? "Before this century shall end, journalism will be the whole press -- the whole human thought," declared the French poet and politician Alphonse de Lamartine in 1831. "Thought will spread across the world with the rapidity of light, instantly conceived, instantly written, instantly understood. It will blanket the earth from one pole to the other -- sudden, instantaneous, burning with the fervor of the soul from which it burst forth. This will be the reign of the human word in all its plenitude. Thought will not have time to ripen, to accumulate into the form of a book -- the book will arrive too late. The only book possible from today is a newspaper."
Lamartine was mistaken. At the century's end, books were still around, living happily beside newspapers. But a new threat to their existence had already emerged: Thomas Edison's phonograph. It seemed obvious, at least to the intelligentsia, that people would soon be listening to literature rather than reading it. In an 1889 essay in the Atlantic Monthly, Philip Hubert predicted that "many books and stories may not see the light of print at all; they will go into the hands of their readers, or hearers rather, as phonograms." The phonograph, which at the time could record sounds as well as play them, also "promises to far outstrip the typewriter" as a tool for composing prose, he wrote. That same year, the futurist Edward Bellamy suggested, in a Harper's article, that people would come to read "with the eyes shut." They would carry around a tiny audio player, called an "indispensable," which would contain all their books, newspapers, and magazines. Mothers, wrote Bellamy, would no longer have "to make themselves hoarse telling the children stories on rainy days to keep them out of mischief." The kids would all have their own indispensables.
Five years later, Scribner's Magazine delivered the seeming coup de grâce to the codex, publishing an article titled "The End of Books" by Octave Uzanne, an eminent French author and publisher. "What is my view of the destiny of books, my dear friends?" he wrote. "I do not believe (and the progress of electricity and modern mechanism forbids me to believe) that Gutenberg's invention can do otherwise than sooner or later fall into desuetude as a means of current interpretation of our mental products." Printing, a "somewhat antiquated process" that for centuries "has reigned despotically over the mind of man," would be replaced by "phonography," and libraries would be turned into "phonographotecks." We would see a return of "the art of utterance," as narrators took the place of writers. "The ladies," Uzanne concluded, "will no longer say in speaking of a successful author, 'What a charming writer!' All shuddering with emotion, they will sigh, 'Ah, how this "Teller's" voice thrills you, charms you, moves you.'"
The book survived the phonograph as it had the newspaper. Listening didn't replace reading. Edison's invention came to be used mainly for playing music rather than declaiming poetry and prose. During the twentieth century, book reading would withstand a fresh onslaught of seemingly mortal threats: moviegoing, radio listening, TV viewing. Today, books remain as commonplace as ever, and there's every reason to believe that printed works will continue to be produced and read, in some sizable quantity, for years to come. While physical books may be on the road to obsolescence, the road will almost certainly be a long and winding one. Yet the continued existence of the codex, though it may provide some cheer to bibliophiles, doesn't change the fact that books and book reading, at least as we've defined those things in the past, are in their cultural twilight. As a society, we devote ever less time to reading printed words, and even when we do read them, we do so in the busy shadow of the Internet. "Already," the literary critic George Steiner wrote in 1997, "the silences, the arts of concentration and memorization, the luxuries of time on which 'high reading' depended are largely disposed." But "these erosions," he continued, "are nearly insignificant compared with the brave new world of the electronic." Fifty years ago, it would have been possible to make the case that we were still in the age of print. Today, it is not.
Some thinkers welcome the eclipse of the book and the literary mind it fostered. In a recent address to a group of teachers, Mark Federman, an education researcher at the University of Toronto, argued that literacy, as we've traditionally understood it, "is now nothing but a quaint notion, an aesthetic form that is as irrelevant to the real questions and issues of pedagogy today as is recited poetry -- clearly not devoid of value, but equally no longer the structuring force of society." The time has come, he said, for teachers and students alike to abandon the "linear, hierarchical" world of the book and enter the Web's "world of ubiquitous connectivity and pervasive proximity" -- a world in which "the greatest skill" involves "discovering emergent meaning among contexts that are continually in flux."
Clay Shirky, a digital-media scholar at New York University, suggested in a 2008 blog post that we shouldn't waste our time mourning the death of deep reading -- it was overrated all along. "No one reads War and Peace," he wrote, singling out Tolstoy's epic as the quintessence of high literary achievement. "It's too long, and not so interesting." People have "increasingly decided that Tolstoy's sacred work isn't actually worth the time it takes to read it." The same goes for Proust's In Search of Lost Time and other novels that until recently were considered, in Shirky's cutting phrase, "Very Important in some vague way." Indeed, we've "been emptily praising" writers like Tolstoy and Proust "all these years." Our old literary habits "were just a side-effect of living in an environment of impoverished access." Now that the Net has granted us abundant "access," Shirky concluded, we can at last lay those tired habits aside.
Such proclamations seem a little too staged to take seriously. They come off as the latest manifestation of the outré posturing that has always characterized the anti-intellectual wing of academia. But, then again, there may be a more charitable explanation. Federman, Shirky, and others like them may be early exemplars of the post-literary mind, intellectuals for whom the screen rather than the page has always been the primary conduit of information. As Alberto Manguel has written, "There is an unbridgeable chasm between the book that tradition has declared a classic and the book (the same book) that we have made ours through instinct, emotion and understanding: suffered through it, rejoiced in it, translated it into our experience and (notwithstanding the layers of readings with which a book comes into our hands) essentially become its first readers." If you lack the time, the interest, or the facility to inhabit a literary work -- to make it your own in the way Manguel describes -- then of course you'd consider Tolstoy's masterpiece to be "too long, and not so interesting."
Although it may be tempting to dismiss those who suggest that the value of the literary mind has always been exaggerated, doing so would be a mistake. Their arguments are another important sign of the fundamental shift taking place in society's attitude toward intellectual achievement. Their words also make it a lot easier for people to justify that shift -- to convince themselves that surfing the Web is a suitable, even superior, substitute for deep reading and other forms of calm and attentive thought. In arguing that books are archaic and dispensable, Federman and Shirky provide the intellectual cover that allows thoughtful people to slip comfortably into the permanent state of distractedness that defines the online life.
Our desire for fast-moving, kaleidoscopic diversions didn't originate with the invention of the World Wide Web. It has been present and growing for many decades, as the pace of our work and home lives has quickened and as broadcast media like radio and television have presented us with a welter of programs, messages, and advertisements. The Internet, though it marks a radical departure from traditional media in many ways, also represents a continuation of the intellectual and social trends that emerged from people's embrace of the electric media of the twentieth century and that have been shaping our lives and thoughts ever since. The distractions in our lives have been proliferating for a long time, but never has there been a medium that, like the Net, has been programmed to so widely scatter our attention and to do it so insistently.
David Levy, in Scrolling Forward, describes a meeting he attended at Xerox's famed Palo Alto Research Center in the mid-1970s, a time when the high-tech lab's engineers and programmers were devising many of the features we now take for granted in our personal computers. A group of prominent computer scientists had been invited to PARC to see a demonstration of a new operating system that made "multitasking" easy. Unlike traditional operating systems, which could display only one job at a time, the new system divided a screen into many "windows," each of which could run a different program or display a different document. To illustrate the flexibility of the system, the Xerox presenter clicked from a window in which he had been composing software code to another window that displayed a newly arrived e-mail message. He quickly read and replied to the message, then hopped back to the programming window and continued coding. Some in the audience applauded the new system. They saw that it would enable people to use their computers much more efficiently. Others recoiled from it. "Why in the world would you want to be interrupted -- and distracted -- by e-mail while programming?" one of the attending scientists angrily demanded.
The question seems quaint today. The windows interface has become the interface for all PCs and for most other computing devices as well. On the Net, there are windows within windows within windows, not to mention long ranks of tabs primed to trigger the opening of even more windows. Multitasking has become so routine that most of us would find it intolerable if we had to go back to computers that could run only one program or open only one file at a time. And yet, even though the question may have been rendered moot, it remains as vital today as it was thirty-five years ago. It points, as Levy says, to "a conflict between two different ways of working and two different understandings of how technology should be used to support that work." Whereas the Xerox researcher "was eager to juggle multiple threads of work simultaneously," the skeptical questioner viewed his own work "as an exercise in solitary, singleminded concentration." In the choices we have made, consciously or not, about how we use our computers, we have rejected the intellectual tradition of solitary, single-minded concentration, the ethic that the book bestowed on us. We have cast our lot with the juggler.
Excerpted from The Shallows: What the Internet Is Doing to Our Brains by Nicholas Carr. Copyright 2010 by Nicholas Carr. Excerpted by permission of W.W. Norton & Co.