PROLOGUE
The Crisis of Knowledge
THREE OF THE SIX HEADLINES on the front page of the New York Times on the day I happen to be writing this (June 21, 2010) could have as their subhead “Knowledge in Crisis!”
The lead story on this randomly chosen day at The Paper of Record takes a long look at the reasons behind the failure of the supposed fail-safe mechanism on the British Petroleum oil rig that fouled the Gulf of Mexico.1 The five authors explain clearly what a “blind shear ram” is (“two tough blades . . . poised to slice through the drill pipe, seal the well and save the day”), how close it came to working, and what exactly went wrong. The article bounces us from vivid descriptions of the moments when the equipment proved inadequate, to an extensive examination of the claims made by the oil industry, to a discussion of the internal processes of a lax regulatory agency. The article’s controversial upshot: Fail-safe mechanisms sound reassuring but in actuality create a terrifyingly risky “single point of failure.”
The subject of the article may be the BP oil spill, but its real topic is the limits of expert knowledge in tackling complex problems. It attempts to explain what we could have done to prevent the disaster, given how hard it is for us to know what will work. How wide is the inevitable gap between our perfect theories and their imperfect mechanical realization? How much of what we know depends on what we would like to believe? What are the institutional biases that prevent us from acting on what we know? Can the forces that corrupt knowledge be countered, or should we recognize that knowledge is always going to be degraded by politics and greed?
Below this front-page article is a preliminary investigation of John Updike’s archive, suggesting that the author researched the settings of his novels with a great concern for accuracy—down to the sales figures for Toyota franchises and the look of Florida license plates. The archive reveals rich details about the private life of a writer who carefully controlled the public view of himself—“a one-man gated community,” the article says—but who seems to have prepared for an open house once he died, preserving correspondence and even recording the grades he got on quizzes as an undergraduate at Harvard. The Updike revealed by the archives is at some variance with the Updike we thought we knew. For example, though he encouraged a reputation for revising little, the archives he carefully preserved show that he meticulously reworked his manuscripts.2
The story is about John Updike, but it raises the important question of how we are going to understand artists once they no longer leave paper trails. It’s the paper that Updike collected and preserved that lets us see how much of his fiction is indebted to researching the facts that populate his characters’ world. And it’s the paper with Updike’s penciled markings that allows us to see the effort behind his effortless prose. What will we be able to know about writers whose drafts and markings vanish in a flow of insubstantial bits? Without such a record, how will we be able to observe, as the Times article does, that Updike’s early letters pay scant attention to the Korean War and McCarthyism? How will we know what can be learned only by noticing what is not mentioned, if personal archives become as fragile as the magnetic traces on an aging hard drive?
At the bottom of the front page of this particular edition of the Times is a feature about soccer players in the World Cup who fake injury to draw a free kick.3 The article notes that it would be much easier to nab these “actors” if the referees had access to video replays, but that would require sacrificing the flow and spontaneity of the sport—a political and cultural change that the International Federation of Association Football is reluctant to make.
We could read this as a sports story, of course, but it is also about the complex role that knowledge plays in our world. How much does accuracy matter? How much are we willing to let experts intrude in order to get a better ruling? What are the positive aspects of the fallibility of human knowledge? Do we want to let experts swarm onto every literal and figurative field? Doesn’t expertise come with a cost? Does there turn out to be a benefit to letting events have blurry edges of ignorance?
These three newspaper articles from different realms of life are part of a long argument we’ve been having about knowledge over the roughly 2,500 years since we decided it would be useful to distinguish reliable ideas from mere opinions. Despite the constant disputes, the basics of our system of knowledge are quite well defined. Especially for those who have grown up digital, here’s a reminder of how it works:
People study hard and become experts in particular areas. They earn credentials—degrees, publications, the occasional Nobel Prize—that make it easier for us to trust them. They write books, teach classes, and go on TV, so that we all can benefit from their hard work. The results of that work go through vetting processes appropriate to the type and importance of their claims, providing us with even more assurance of their accuracy. As new discoveries are made and sanctioned, the body of knowledge grows. We build on it, engaging in a multi-generational project that, albeit with occasional missteps, leads us further along in our understanding of the world. Knowledge is a treasure, knowing is the distinctively human activity, and our system of knowledge is the basis for the hope that we might all one day come to agreement and live in peace.
We’ve grown up thinking that this is how knowledge works. But as the digital age is revealing, that’s how knowledge worked when its medium was paper. Transform the medium by which we develop, preserve, and communicate knowledge, and we transform knowledge.
Within those three front-page New York Times stories we can already discern challenges to some of our most basic ideas about what knowledge is and how it works:
In the three months before it was finally capped, BP’s gushing oil could be seen live on any Web site that cared to embed the video, surrounded by whatever text and links the page’s owner thought important to understand it. The online version of the New York Times article linked to its source data, including “previously unreleased notes scrawled by industry crisis managers,”4 in case we wanted to become our own experts over the course of breakfast. Every blogger is a broadcaster, and every reader is an editor.
Updike’s archive “may be the last great paper trail,” as the article says, and every reader who uses a word processor—or, as most of us say these days, who writes—contrasts the solid traces we used to leave behind with the digital dust we currently leave in our wake: more of it for sure, but also more likely to be blown away by a hard-drive failure or a change from floppies to CDs to DVDs to Blu-Rays to whatever comes next and next after that. A paper archive like Updike’s seems so quaint these days—so manageable in size, so under the control of the person who is its subject. What are people going to know about us when they are left to rake through the acres of drafts and photos we’ve strewn across our hard drives and Facebook pages?
Every spectator of the World Cup can see the replays that the referees cannot, making the lively online discussion among soccer fans more fact-based and knowledgeable than the decisions of the expert judges on the field.
The crisis in knowledge goes far beyond questions raised by articles on the front page of one morning’s newspaper, and far beyond the blurring of lines between readers and editors, authors and biographers, spectators and referees. Our most important institutions are being shaken by questions about knowledge that we thought were as firmly settled as those institutions’ marble and concrete foundations:
Universities are debating whether professors ought to be required to post all their research freely on the Web, rather than (or in addition to) publishing it in prestigious but expensive journals. Further, should a professor who is shaping the discipline’s discussion through her vigorous participation in online and social media get tenure even if she hasn’t published sufficiently in peer-reviewed journals?
 
Librarians are enmeshed in a struggle for a workable vision of a future for their institutions, not only debating the merits of new techniques for navigating collections but wondering how to weigh the expertise of the “crowd” against that of those with credentials.
 
Major business consulting firms—once charged with preparing shiny, conclusive reports—now experiment with providing clients with access to a network of experts who represent a divergent range of opinions.
 
Business leaders, looking at the overwhelming amount there is to know in a globalized world, are trying new decentralized decision-making processes that more effectively take advantage of the expertise spread out across their networks—modeling themselves on the distributed leadership becoming common at large, Web-based collaborative projects such as Wikipedia.
 
The US intelligence agencies and the State Department are caught up in internal battles between the old “need to know” culture and the new “need to share” mindset. The executive branch of the US government is struggling to define exactly how much and what type of information its agencies should release to its citizens.
 
The sciences find themselves both being enhanced by the efforts of amateurs and having to defend their reliability against amateurs—sometimes fiercely partisan ones—who have access to the same data as the professionals. Even among many respected scientists, the traditional journals are beginning to look like blockages in the system of knowledge, for they can publish so few of the worthwhile submissions. The journal Nature—at the top of the prestige pyramid—has begun its own online site where it publishes without regard to page count, to compete with the new generation of open-access journals that have rapidly grown in stature and importance.
 
As for the media, you can hardly get them to stop talking about what they’re going to do about the Internet, where there are no editors and where the old media are perceived as biased and self-involved.
At its worst, this crisis of knowledge is apparent in the jumble of fears put forward as obvious: The Internet is an unedited mash of rumor, gossip, and lies. It splinters our attention and spells the end of reflective, long-form thought. Our children don’t read anymore. They certainly don’t read newspapers. Everyone with any stupid idea has a megaphone as big as that of educated, trained people. We form “echo chambers” online and actually encounter fewer challenges to our thinking than we did during the broadcast era. Google is degrading our memories. Google is making us stupid. The Internet loves fervid, cult-driven amateurs and drives professionals out of business. The Internet represents the ascension of yahoos, a victory lap for plagiarists, the end of culture, the beginning of a dark age inhabited by glassy-eyed chronic masturbators who judge truth by the number of thumbs up, wisdom by the number of views, and knowledge by whatever is the most fun to believe.
And yet, at the very same time, sites such as PolitiFact.com are fact-checking the news media more closely and publicly than ever before, and Jodi Kantor, a reporter for the New York Times, says that knowing that bloggers will go over every word she writes has made her better at her job.5 Libraries are breaking new ground in using all available data—including contributions from readers—to make it far easier than ever for readers to find and understand the resources they need. Science is advancing at an unheard-of pace thanks to new collaborative techniques and new ways to publish vast amounts of data and trawl it for patterns and inferences. Businesses are managing in an era that defies predictions by finding expertise in every corner of their organizations, and across the broad swath of their stakeholders.
So, we are in a crisis of knowledge at the same time that we are in an epochal exaltation of knowledge. We fear for the institutions on which we have relied for trustworthy knowledge, but there’s also a joy we can feel pulsing through our culture. It comes from a different place. It comes from the networking of knowledge. Knowledge now lives not just in libraries and museums and academic journals. It lives not just in the skulls of individuals. Our skulls and our institutions are simply not big enough to contain knowledge. Knowledge is now a property of the network, and the network embraces businesses, governments, media, museums, curated collections, and minds in communication.
That knowledge is a property of the network means more than that crowds can have a type of wisdom in certain circumstances. And, as we will see, it’s not simply that under some circumstances groups are smarter than their smartest member. Rather, the change in the infrastructure of knowledge is altering knowledge’s shape and nature. As knowledge becomes networked, the smartest person in the room isn’t the person standing at the front lecturing us, and isn’t the collective wisdom of those in the room. The smartest person in the room is the room itself: the network that joins the people and ideas in the room, and connects to those outside of it. It’s not that the network is becoming a conscious super-brain. Rather, knowledge is becoming inextricable from—literally unthinkable without—the network that enables it. Our task is to learn how to build smart rooms—that is, how to build networks that make us smarter, especially since, when done badly, networks can make us distressingly stupider.
The new way of knowing is just now becoming apparent. Although we can’t yet know its adult form, some aspects are already taking shape. Networked knowledge is less certain but more human. Less settled but more transparent. Less reliable but more inclusive. Less consistent but far richer. It feels more natural because the old ideals of knowledge were never realistic, although it’s taken the networking of our culture to get us to admit this.
This book will follow one particular pathway through an impossibly large territory. That’s appropriate, since at the core of knowledge’s new exaltation and transformation is a straightforward acknowledgment of one basic truth we’ve always known but that our paper-based system of knowledge simply could not accommodate: The world is far, far too big to know.