Wikipedia was founded in 2001, but the critical ideas and developments that shaped the site emerged long before that. These ideas are presented below in chronological order. They show a quickening pace, especially after 1990, when the World Wide Web became a concrete proposal. Throughout the 1990s, technology advanced, new ways of thinking about tools emerged, and innovative developments combined to change both what computer technology could do and what it meant. These developments produced ideas that are still shaping the world. Wikipedia is also part of a long tradition that predates the Internet, however, and some much older ideas feed into its culture—not least of which is the revolutionary concept of the encyclopedia.
What is an encyclopedia? To most people, an encyclopedia is a large book or multivolume work. Composed of a comprehensive collection of short articles, an encyclopedia divides an area of knowledge into separate topics. Encyclopedias are reference works, designed to orient new readers, gather details that might previously have been spread over many publications, and summarize the available information in comprehensible terms. A good encyclopedia can answer many questions without replacing the sources from which it was constructed.
Encyclopedias are examples of tertiary sources. They are neither primary sources, such as historical documents, nor are they secondary sources, such as textbooks, which usually discuss, report on, or interpret primary sources. Instead, an encyclopedia's compilers have gathered and summarized available secondary sources (often noting primary sources as well) to report on a field of knowledge and current thinking at that particular time.
The encyclopedia has venerable origins. Early examples exist in manuscript form in cultures around the world, and bound encyclopedias have been around almost as long as there have been books at all. Pliny's enormous Historia naturalis, written in 77 AD, is often cited as one of the first encyclopedias; this work was influential for at least 1,500 years. Some of the other earliest encyclopedias were written in Chinese (the now-lost Huang Lan, compiled around 220 AD) and Arabic (the 10-volume Kitāb 'Uyūn al-Akhbār, or Adab al-Kitāb, compiled around 880 AD). Throughout the medieval era in Europe, other encyclopedic works were developed, many written in Latin and organized around philosophical and religious ideas.
The word encyclopedia was not used to describe these works until much later, however. So where did this word originate? Wikipedia itself provides this explanation, crediting the 16th-century scholar Joachim Sterck van Ringelbergh (Figure 2-1):
The word encyclopedia comes from the Classical Greek "ἐγκύκλια παιδεία" (pronounced "enkyklia paideia"), literally, a "[well-]rounded education," meaning "a general knowledge." Though the notion of a compendium of knowledge dates back thousands of years, the term was first used in 1541 in the title of a book by Joachimus Fortius Ringelbergius, Lucubrationes vel potius absolutissima kyklopaideia (Basel, 1541). The word encyclopaedia was first used as a noun by the encyclopedist Pavao Skalic in the title of his book, Encyclopaedia seu orbis disciplinarum tam sacrarum quam prophanarum epistemon (Encyclopaedia, or Knowledge of the World of Disciplines, Basel, 1559). (From [[Encyclopedia]], April 2007)
The earliest encyclopedias compiled knowledge about the entire world and were meant to be read straight through as a complete education.[3] This notion eventually evolved into the more modern concept of an encyclopedia as a reference work, more akin to the concept of a dictionary in which words are defined for easy consultation. (Encyclopedic dictionaries, a hybrid form, have existed since at least the second century AD.) An encyclopedia in the contemporary sense may illustrate objects, map places, contain articles about history, geography, science, and biography, and cover the spectrum of factual knowledge.
In the modern age, traditional encyclopedias have worked hard to balance the topics important to their audience against limited space and editorial capacity. Generalist encyclopedias aim to be universal in scope while remaining compact enough to be fully updated every few decades and to fit on a bookshelf. Specialist encyclopedias can fill a similar amount of space with a single field or subfield. A general children's encyclopedia such as World Book is written with a different format and different goals than a scientific encyclopedia, but both provide clear introductions to topics. This formula has been a successful one, providing publishers with strong sales from the 18th century to today.
Today thousands of specialist encyclopedias are in print (Figure 2-2 shows one of these, the Encyclopedia Lituanica, an English-language six-volume encyclopedia on Lithuania). General encyclopedias have become household names: Encyclopaedia Britannica[4] and World Book for English speakers, the German Brockhaus, and the French Larousse. The Great Soviet Encyclopedia grew to 100,000 articles in Russian and produced encyclopedias in other languages of the USSR.
Figure 2-2. The six-volume Encyclopedia Lituanica, published from 1970 to 1980 in Boston, Massachusetts
The encyclopedia as we know it today was strongly influenced by the 18th-century European Enlightenment. Wikipedia shares those roots, which include the rational impetus to understand and document all areas of the world.
Jonathan Israel[5] cites the Grand Dictionnaire of Louis Moréri (Figure 2-3) as the first modern encyclopedia. Published in 1674, it ran to many editions over half a century. Then, as now, times were changing: The Royal Society of London, founded the previous decade, was composed of amateurs, mostly outside the universities, but its members were pioneers of the learned society and the modern scientific method. The new media of the time were journals, such as the Royal Society's Philosophical Transactions, which were used to spread knowledge of scientific discoveries and theories. According to Israel, by the decade after Moréri's compilation appeared, the new institution of the learned journal threatened existing authority.
By the Enlightenment, the Renaissance concept of the polymathic uomo universale or universal man had been stretched to its limits. Science and exploration had added many facts to the body of knowledge, and no one person could grasp everything significant.
Encyclopedia editors made fields of knowledge available to the reading public by coordinating the efforts of leading scholars and intellectuals and condensing the available information. Israel writes that "these massive works … were expressly produced for a broad market." He mentions the "stupendous" 64-volume Zedler Universal-Lexicon in German (published 1731–1750); he also comments on the sheer expense of a well-stocked library at that time.[6] Access to general information was now available for the prosperous middle class; it was no longer confined to the rich and those actively involved in the intellectual networks.
The new generation of encyclopedias, of which the best known is Denis Diderot's provocative French Encyclopédie, ou dictionnaire raisonné des sciences, des arts et des métiers (Encyclopedia, or a systematic dictionary of the sciences, arts and crafts), consisted of general works. They included all areas of knowledge, from the technical to the esoteric to the theological.
Wikipedia carries on these encyclopedist traditions but with some radical changes. The most obvious change is technological: Wikipedia stores information online, so its scope is not limited by the economics of printing.
Wiki page structure encourages many short articles rather than a few long ones. This works because pages are hypertext: a collection of articles linked back and forth. Earlier encyclopedias used footnotes and indexes as a way to link to other articles, but Wikipedia uses hypertext to its full potential, giving it a very different organizational style compared to the printed page. This extensive linking extends beyond articles in the English-language version: Wikipedias in different languages, from French to Swahili (Figure 2-4), are cross-referenced with tens of millions of links, as described further in Chapter 15.
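To make that linking model concrete, here is a minimal sketch in Python of how a wiki engine might pick links out of page markup. The double-bracket syntax is Wikipedia's own, but the regular expression, the sample text, and the colon test are invented simplifications for illustration, not MediaWiki's actual parser.

```python
import re

# A toy scan for wiki-style links. [[Title]] links to another article;
# [[Title|label]] shows different display text; a language prefix such
# as "fr:" marks a cross-reference to another language's Wikipedia.
WIKILINK = re.compile(r"\[\[([^|\]]+)(?:\|[^\]]*)?\]\]")

sample = (
    "The [[Eiffel Tower]] stands on the [[Champ de Mars|Champ-de-Mars]] "
    "in Paris. [[fr:Tour Eiffel]]"
)

for target in WIKILINK.findall(sample):
    if ":" in target:  # naive: treat any colon prefix as a language code
        lang, title = target.split(":", 1)
        print(f"interlanguage link -> {lang}: {title}")
    else:
        print(f"article link -> {target}")
```

A real parser must also distinguish namespace prefixes (such as Talk:) from language codes, so the colon test here is deliberately naive.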
As described in Chapter 1, Wikipedia editors encounter the same issues that the original encyclopedia editors did—what topics to include and how to present them—and address these issues by developing content standards and style guidelines. Articles should be concise surveys, not personal essays: complete, accurate, and objective. They should summarize topics quickly in the lead section, as dictionaries do. These stylistic guidelines help Wikipedia fulfill the encyclopedia's traditional function: People consult the site for rapid introductions to a subject, written for the general reader.
Wikipedia's scope is far greater than that of previous encyclopedic projects, however. Encyclopedias have traditionally been published as comprehensive guides to some defined area of knowledge. Wikipedia is instead a collection of both specialist and generalist encyclopedias, linked together into an integrated work. Its articles can be updated immediately: Articles are dynamic, and their content can change from day to day or even (in the case of current events) from minute to minute. Wikipedia's huge scale and rapid updating are possible in part because the authorship model is completely different from that of earlier projects: The idea of the famous author or expert-written article has been discarded.
Finally, unlike earlier encyclopedias, Wikipedia is a noncommercial project, and its content is deliberately licensed so others can freely use it. This ease of access alone is surely far beyond what the early encyclopedists hoped for.
Jumping forward several hundred years, we'll now explore the technological part of Wikipedia's heritage: the free software movement, the development and widespread growth of the Internet and the personal computer, and the development of wiki technology.
During the late 1960s, two key developments in computing technology occurred. The first was the beginning of the modern operating system essential to networked computing. In the 1960s, the computers in the public eye were the hugely expensive S/360 series of mainframe computers from IBM, whose twitching tape drives became iconic for speedy electronic brainwork. Meanwhile, comparatively disregarded at the time, the Unix operating system at Bell Labs was created on a humble PDP-7 minicomputer from the Digital Equipment Corporation. (According to legend, the machine had been recycled after having been left in a corridor.) Unix ultimately became one of the most widely used operating systems for the servers that power the Internet, continuing to flourish long after the IBM mainframes became hardware dinosaurs and inspiring a variety of free software projects.
During this same time period, the groundwork was laid for the network that would become the Internet. Called ARPANET, this original network was a US Department of Defense project first theorized in the 1960s. Along with other networks, ARPANET provided some of the first connections to universities and research institutions. Later, the technology behind this network became the basis for new consumer networks: The first commercial email service was offered by CompuServe in 1979, the same year newsgroup software was developed.
A decade later, Tim Berners-Lee would develop a networked implementation of the idea of hypertext, an idea that would become the World Wide Web. With the development of web browsers in the early 1990s, consumers, who had been buying personal computers since the mid-1970s (a phenomenon that became widespread with the introduction of the Apple II in 1977), could now "go online" and participate in the growing Internet. These developments, occurring over just a few decades, completely reshaped the modern world and made large online projects like Wikipedia possible. The advent of personal networked computing also provided the necessary technical background for the cultural ideas of free software and online communities, which are critical to Wikipedia's development.
In the early 1980s, Richard M. Stallman, a software developer at MIT's Artificial Intelligence Lab, became alarmed at what he saw as a loss of freedom for computer programmers. Stallman had spent more than a decade working in a collegial environment, where changing or amending software was technically feasible and free of legal worries. If someone needed another person's computer program, he simply asked for it and adapted it.
As explained on Wikipedia:
In the late 1970s and 1980s, the hacker culture that Stallman thrived in began to fragment. To prevent software from being used on their competitors' computers, most manufacturers stopped distributing source code and began using copyright and restrictive software licenses to limit or prohibit copying and redistribution. Such proprietary software had existed before, and it became apparent that it would become the norm. […]
In 1980, Stallman and some other hackers at the AI lab were not given the source code of the software for the Xerox 9700 laser printer (code-named Dover), the industry's first. (From [[Richard Stallman]], April 2007)
While Stallman and other hackers had been able to customize another lab printer so that a message was sent to users trying to print when there was a paper jam, they could not do so with Dover—a major inconvenience, as the printer was on a different floor. Stallman asked for the printer software but was refused; this experience and others convinced Stallman of the ethical need for free software.
Software, by then produced by companies such as Microsoft, was owned and controlled, and sharing it entailed breaking a license and breaking the law. Source code—the version of a program necessary to make changes—was frequently not made available. You couldn't customize software, even after you paid for it.
In 1983, Stallman announced the GNU operating system project and two years later founded the Free Software Foundation. In an essay titled "What is Free Software?" Stallman declared the freedoms essential for free software:
The freedom to run the program for any purpose
The freedom to study how the program works and adapt it to your needs
The freedom to redistribute copies so you can help your neighbor
The freedom to improve the program and release your improvements to the public so the whole community benefits
The GNU project (whose logo, appropriately enough, features a gnu—see Figure 2-5) set out to build a completely free operating system, inspired by Unix. The acronym GNU was a programmer's joke that stood for GNU's Not Unix. A collaborative project, GNU was largely functional by the early 1990s. In 1991, a young Finnish programmer named Linus Torvalds supplied one of the last essential pieces still missing: a kernel.
Torvalds called his project Linux. The combined system of GNU software run on this kernel is known as GNU/Linux and is now widely used by both individuals and corporations. Hundreds of people worldwide have contributed to Linux.[7]
This operating system, which has become the basis of numerous distributions developed for different purposes, has been one of the great successes of the free software movement. Some versions of GNU/Linux are distributed commercially, such as Red Hat Linux. The ideas behind free software have become widespread; other successful free software projects include the Apache web server, which runs many of the world's websites, and the Mozilla web browser, which millions of people use. Today, freely licensed, collaboratively built software supports work by businesses and individuals worldwide.
GNU developers recognized that new software licenses, which differed from traditional ideas of copyright, needed to be created to preserve the freedom to share these programs legally. Although the rights assigned with copyright have been of concern for a long time—copyright is mentioned in the US Constitution, which grants Congress the power to "promote the Progress of Science and useful Arts, by securing for limited Times to Authors and Inventors the exclusive Right to their respective Writings and Discoveries"—the advent of the personal computer and the Internet has magnified and broadened copyright issues. Broadly speaking, copyright law assigns the author of a creative work certain exclusive rights to sell and distribute that work, keeping others from copying and profiting from an author's work without permission. Today, copyright is assigned automatically in the United States and in many other countries when a work is created. However, because copying a work, such as a computer file, is now quick, routine, and virtually free, many questions have been raised about the place and effectiveness of copyright law in an electronic environment.
As an alternative to traditional copyright, Stallman created the GNU General Public License (GPL) in 1989; today, this license is widely used for free software. The GPL is an example of copyleft—a movement to protect the freedom of creative works through new licensing arrangements that incorporate ideas from free software.
As usual, Wikipedia has plenty to say on the matter:
Copyleft is a play on the word copyright and is the practice of using copyright law to remove restrictions on distributing copies and modified versions of a work for others and requiring that the same freedoms be preserved in modified versions.
Copyleft is a form of licensing and may be used to modify copyrights for works such as computer software, documents, music, and art. In general, copyright law allows an author to prohibit others from reproducing, adapting, or distributing copies of the author's work. In contrast, an author may, through a copyleft licensing scheme, give every person who receives a copy of a work permission to reproduce, adapt or distribute the work as long as any resulting copies or adaptations are also bound by the same copyleft licensing scheme. (From [[Copyleft]], April 2007)
By the turn of the 21st century, free software ideas had spread well beyond computer code. In 2000, Stallman created the GNU Free Documentation License (GFDL). The GFDL was conceived of as a complementary license to the GPL but was intended for written works such as software documentation rather than code. Wikipedia adopted the GFDL early on as its license for all content created on the site—a move that guarantees the site's content will remain perpetually free for everyone to use and redistribute.
Wikipedia's approach is tied to the ideals of the free software movement. Both the software on which Wikipedia runs (MediaWiki) and the site's content are freely available for anyone to use, adapt, and modify, qualified only by the requirements of their respective GPL and GFDL licenses. Wikipedia's slogan is Wikipedia, the free encyclopedia. No one has to pay to view Wikipedia articles, but free means more than that: Free also means "no strings attached," and this is the consistent goal of the Wikimedia projects. Freedom means free of cost, free of restrictions on changing and modifying any content, free to redistribute, free for anyone to participate in, and free of commercial influences.[8] The GFDL specifies that any work placed under it may be legally reused and republished by anyone, with the only restriction being that any such republication must itself be licensed under the GFDL (and the original authors must be credited). In other words, the license ensures that any GFDL-licensed content is both freely available and open to all. Though contributors to Wikipedia retain the copyrights to their work, they give up the right to control what can be done with it.
Thus another site can repackage and profit from Wikipedia articles, as long as it respects the license. In fact, there are many legitimate sites like this, called mirror sites, and anyone using a search engine will come across them often. The only rules are that if a site does copy Wikipedia material, those pages must also be licensed under the GFDL and must acknowledge the content's origin. Because of this clause, the GFDL is sometimes called a viral license: It propagates and perpetuates itself.
Any author adding to Wikipedia should know what the license means. If having personal control over your work matters to you, you should not add it to Wikipedia. Once you have saved your contributions to the site, you've conceded that others can modify them and use them in any way they wish under the licensing terms.
Other works using the GFDL include the book you're reading; its text may be reused under the same conditions. The GFDL requires a history of authorship; on Wikipedia, you can look up the full list of an article's original authors (including pseudonyms, automated edits, and IP numbers) in the page history of every Wikipedia page we cite. You'll find more about the GFDL and reuse compliance in Appendixes A and E.
Tim Berners-Lee, the pioneer of the World Wide Web's technology, has said he always intended for the Web to be interactive. The social and cooperative side of Internet usage is now catching up with that potential, and wiki sites are just one part of a larger pattern.
A wiki is a type of website that anyone can edit. Setting up a wiki creates an effective tool for collaborative group authoring. Simply speaking, a wiki is a collection of web pages, located at a common address on the World Wide Web, that link to each other through their page titles and can be edited online by contributors without special permissions. More technically, a wiki is a kind of database of pages that are displayed as HTML, the markup language used on the Web, but that contributors edit using a much simpler markup language.
Structurally, a wiki can contain multiple discussions consisting of many topics and is by its very nature dynamic and changing. Most wikis record the changes that are made to them, keep previous versions of pages, and make it very simple to add clickable links from one page to another page on the site. Openness is a key feature of most wikis as well. You don't need much technical knowledge or special permission to edit most wiki pages; instead, you can change them as you see fit. Wiki pages contrast with conventional web pages that have largely static and uneditable content.
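As a rough illustration of that record keeping, here is a minimal sketch in Python of a page store that appends a new revision on every save instead of overwriting the old text. The class and method names are invented for this example; a production wiki engine such as MediaWiki keeps its revisions in a database, but the principle is the same.

```python
from datetime import datetime, timezone

class WikiPage:
    """A toy wiki page: every save is kept, so history is never lost."""

    def __init__(self, title):
        self.title = title
        self.revisions = []  # each entry: (timestamp, author, text)

    def save(self, author, text):
        # Append rather than overwrite: earlier versions stay retrievable.
        self.revisions.append((datetime.now(timezone.utc), author, text))

    def current(self):
        return self.revisions[-1][2] if self.revisions else ""

    def history(self):
        return [(ts.isoformat(), author) for ts, author, _ in self.revisions]

page = WikiPage("Sandbox")
page.save("Alice", "A wiki is a website anyone can edit.")
page.save("Bob", "A wiki is a website that anyone can edit and improve.")
print(page.current())   # Bob's text, the latest revision
print(page.history())   # both revisions remain, oldest first
```

Because nothing is discarded, a bad edit can be undone simply by restoring an earlier revision, which is one reason open editing is workable in practice.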
The wiki concept and the name come from Howard G. "Ward" Cunningham, an American computer programmer. Instead of going with his first idea for a name, QuickWeb, he chose the Hawaiian term wiki wiki when setting up his website, WikiWikiWeb:
In order to make the exchange of ideas between programmers easier, Ward Cunningham started developing the WikiWikiWeb in 1994 based on the ideas developed in HyperCard stacks that he built in the late 1980s. He installed the WikiWikiWeb on his company Cunningham & Cunningham's website c2.com on March 25, 1995. Cunningham named WikiWikiWeb that way because he remembered a Honolulu International Airport counter employee telling him to take the so-called "Wiki Wiki" Chance RT-52 shuttle bus line that runs between the airport's terminals. "Wiki Wiki" is a reduplication of "wiki," a Hawaiian-language word for fast. (From [[WikiWikiWeb]], April 2007)
On this original wiki site, meant for the Portland Pattern Repository (Figure 2-6), programmers exchanged ideas on patterns and approaches to programming, forming a somewhat rambling but fruitful discussion space.
In its original concept, a wiki expresses the views of a community with some common interest and brings people together in a shared space for discussing ideas and building resources. The main point of a wiki website is to make it easy for contributors to collaborate in building its content, whatever that content may be. If the site is wide open, what "the community" is may be nebulous, but a wiki community is often simply defined as those people who are editing the site.
A wiki, then, is not simply a technology but a whole approach for a group using a website to collaborate. This approach, which you could call a philosophy, cannot really be expressed by looking at single users or editors: Wikis have a collective aspect. In this, wikis are related to and draw from the culture of other online and open source communities.
For software to be freely available is one thing; for many people to contribute to building that software is another. In an influential 1997 essay, "The Cathedral and the Bazaar," Eric S. Raymond drew on the recent history of Linux development and argued that the open nature of free software allowed for wide-scale collaboration and development. Raymond championed a new term, open source, with a definition similar to that of free software. In the late 1990s, Raymond and a group of Bay Area computer programmers developed an open source movement, which also centered on sharable software but particularly emphasized the pragmatic benefits of collaboratively developed software to companies.
Raymond described how opening up software projects by making source code available and using open development processes could ultimately produce better software by increasing the number of people able to work on it. He coined the aphorism "Given enough eyeballs, all bugs are shallow," which captures how many different people, all concerned with understanding a program, help to find mistakes and other weaknesses and get them fixed quickly. In the essay, he also writes about the other benefits of using a self-selected group of collaborators who act purely out of their own passion for the project:
… contributions [to Linux] are received not from a random sample, but from people who are interested enough to use the software, learn about how it works, attempt to find solutions to problems they encounter, and actually produce an apparently reasonable fix. Anyone who passes all these filters is highly likely to have something useful to contribute. (From Eric S. Raymond, "The Cathedral and the Bazaar," presented at Linux Kongress, 1997)
In a comparable way, Wikipedia urges its many readers to become writers, fact-checkers, and copyeditors, allowing anyone to ask a question or fix incorrect information. In a broad sense, the ideas of shared improvement and collective scrutiny are common to wikis, free software, and the concept of an encyclopedia that anyone can edit.
Wikipedia is famous for fostering an elaborate, unusual volunteer community, but Wikipedia is far from being the first online community or the first wiki community. Other groups had already explored the ideas that would become the basis of Wikipedia's social principles.
Dedicated virtual communities have been around since the very beginning of computer networks. As the Internet has grown, hundreds of online communities have developed, each with its own mores and traditions. The idea of community suggests that the key to understanding how these groups function lies in the individual people involved and in how they interact. Wikipedia suggests a definition of a virtual community as simply "a social network with a common interest, idea, task or goal that interacts in a virtual society across time, geographical and organizational boundaries and is able to develop personal relationships." For instance, some notable early online communities include the following (adapted from [[Virtual community]]):
Usenet, established in 1980 as a distributed Internet discussion system, was one of the first highly developed online communities with volunteer moderators.
The WELL, established in 1985, pioneered some aspects of online community culture with many users voluntarily contributing to community building and maintenance (for example, as conference hosts).
AOL offered various forms of chat and gaming from its inception in 1983 and later helped pioneer the contemporary "chatroom." These chatrooms were initially moderated by volunteer community leaders and helped propel AOL to its position as the largest of the online service providers.
The new wiki communities in the late 1990s started with the idea of interacting online, which had been developed by these and many other online communities, and then added the ideas of open mass collaboration articulated by the growing free and open source software movement. But as wikis matured, they had to develop new ideas and principles for how people could collaborate fruitfully on such open, radically different websites.
The people working on the original WikiWikiWeb coined terms and developed ideas that would later become influential in other wiki communities. They observed, for instance, that people take on different roles, such as wiki gnomes, who beaver away on the site fixing small points of format and style. They also noticed that content could develop on a wiki in various ways (some better than others), for example, as walled gardens: dense clusters of pages that link mainly to one another and that the average editor finds hard to reach.
The conversation continued on one small but influential wiki, MeatballWiki, which was set up in April 2000 by the Canadian Sunir Shah. This wiki attracted those interested in discussing online communities and their dynamics and typical issues. Much of the conversation on MeatballWiki was about the ways in which individual editors tended to respond to the freedom of editing a wiki. The concepts of soft security (security through group dynamics rather than hard-coded limits) and the right to leave (someone should be able to both join and leave a wiki community easily and gracefully) were first discussed here. Users also discussed large-scale concepts that affected the whole community, such as forking and interwiki connections—communities splitting apart or coming together. MeatballWiki continues today, full of essays, discussions, arguments, and musings about what constitutes a healthy, successful online community and what it means to work on a wiki.
Thus, the WikiWikiWeb, MeatballWiki, and other early sites developed the terminology and articulated the principles of structuring community that many wikis, including Wikipedia, operate with today. Wikipedia, in turn, has gone on to apply these ideas in large-scale ways not imagined by these early wikis.
Wikipedia developed in an atmosphere where wikis were already established as a particular kind of online community. The word wiki is sometimes interpreted as a backronym, a back-formed acronym, as if it stood for W-I-K-I. In the style of Internet abbreviations, you could read this as What I Know Is, referring to the knowledge contribution, storage, and exchange functions of wikis. A typical wiki is still reminiscent of notes from an extended brainstorming session: The hypertext structure makes it possible to take up any point in its own smaller discussion thread. The early wikis were precursors to Wikipedia, not only in terms of technology but also because people saw wiki editing, from the start, as a way to share knowledge. Wikipedia, however, changed the model of wikis from being a continuing conversation among peers to being a project for collating information and building a reference resource—and in so doing, showed that you could build a single work with a large, disparate online community spanning language and geography.
Being a wiki site is not intrinsic to Wikipedia's content. The adoption of wiki technology, however, has been key to Wikipedia's quick success in an area where previous projects had failed. From the point of view of a technology historian, Wikipedia already deserves to be called a killer app: the sort of application that by itself justifies adopting the technology behind it. Wikipedia has used its wiki aspects successfully to collate and develop the world's largest encyclopedia so far.
Embracing the history of encyclopedias, the openness of free software, and the easily accessible, collaborative aspects of online communities and wikis meant that Wikipedia was able to draw on both a large pool of technically aware people who saw the benefits of the free software movement and many nontechnical people who were attracted to the encyclopedic mission and community structure. A high level of collaboration has been possible in areas that would have been difficult to foresee. For instance, current events articles are rapidly updated, often with a thousand or more edits from hundreds of people in a single day, demonstrating the extraordinarily responsive power of this collaborative tool.
Wikipedia has been an evolving phenomenon from the start. It has grown rapidly and has steadily attracted more attention.
Wikipedia's immediate predecessor was Nupedia. (This was not the first idea for an Internet encyclopedia, however; Interpedia, a project proposed in 1993, never got off the drawing board.) Nupedia was started by Jimmy Wales, with Larry Sanger serving as editor-in-chief. The project was supported by Bomis, an Internet portal company founded and run by Wales and Tim Shell. Nupedia sought to provide an online encyclopedia website under a free-content license, built from contributed articles. Its model was more conventional, though; it was not a wiki, and contributors were expected to be experts in their fields. The pieces they submitted would be published to the site only after an extensive peer review process. The project's momentum was lost in these multiple review stages, and only a few articles were ever completed.
Wikipedia was created on January 15, 2001, as an alternative based on an open wiki site. Initially, the site was presented as a way to attract new contributors and articles to Nupedia. (Both Sanger and Wales participated in developing the site in the early days, and there was later some dispute over whether they were "co-founders" of Wikipedia. Sanger left the project in 2002, while Wales continues to play a leading role in Wikipedia today.) To differentiate the site from Nupedia, the new project was named Wikipedia.
Wikipedia was immediately successful. Its wiki setup lowered the barriers to entry, and its reputation grew by word of mouth alone—the site has never advertised directly. A few key mentions on popular websites drew notice: In March 2001, a posting appeared on the Slashdot website, and in July of that year, the site received a prominent pointer in a story on Kuro5hin, a community-edited website covering technology and culture. These stories brought surges of traffic to Wikipedia, including many technically savvy visitors. Search engines, especially Google, also brought hundreds of new visitors to the site every day. The first major coverage in the mainstream media came from the New York Times on September 20, 2001.
By mid-2001, Wikipedia was beginning to acquire an identity of its own (Figure 2-7). Versions in Catalan, Chinese, German, French, Hebrew, Italian, Spanish, Japanese, Russian, Portuguese, and Esperanto had been created, and technical support had been set up (mostly far from the public gaze, as Jimmy Wales chatted on IRC and discussed issues on the mailing list). More visitors meant that more articles were written and that more edits were made to improve existing articles (just as important, if a little harder to quantify). The Recent Changes page showed increasing activity. The project passed 1,000 articles around February 12, 2001, and 10,000 articles around September 7, 2001 (see Figure 2-8 for how Wikipedia appeared around December 2001). Nupedia, by contrast, completed only some 24 articles over its lifespan from 2000 to 2003.
Figure 2-7. The Wikipedia logo used from late 2001 until 2003. This logo was designed by a volunteer called The Cunctator and was the winner in an open logo contest. See the progression of the Wikipedia logo over time at http://meta.wikimedia.org/wiki/Meta:Historical/Logo_history.
Today, Wikipedia is a household word (at least in households with access to the Web). By late 2007, the site had become the #8 most visited website worldwide, as measured by Alexa ratings,[9] and the volunteer-based community organization behind Wikipedia has become highly complex, learning from past mistakes and developing institutions. Wikipedia is not only a piece of hypertext; the site is by far the largest and most inclusive single cross-referenced collection of factual information ever to exist. Due in part to this assiduous cross-linking of content, Wikipedia articles are prominent in search engine results; many (if not most) queries on the Web can be answered with a Wikipedia article. Wikipedia is an Internet phenomenon unlike anything seen before—and it could not have technically existed on a comparable scale until quite recently.
During the early years, Wikipedia was administered (technically, financially, and socially) entirely by volunteers. The hardware and personnel needed to run the site were donated by Bomis. As time passed, however, Wikipedia's needs outstripped Bomis's ability to meet them. The site's infrastructure (but not its content) is now run by the nonprofit Wikimedia Foundation (WMF), which is described in depth in Chapter 17.
The WMF, employing a very small staff and governed by a board of directors, has taken on the role of coordinating a very large and disparate group of volunteers from around the world: By 2008, Wikipedias existed in over 250 languages. The Foundation serves as the parent organization for all Wikipedias and sister projects (these other reference projects are described in Chapter 16). Initially based in St. Petersburg, Florida, the WMF moved to San Francisco early in 2008. However, most of the servers that provide Wikipedia's infrastructure are still hosted in Florida, with additional servers in Europe and South Korea.
Figure 2-8. Wikipedia as it appeared in late 2001 (from the Nostalgia wiki, http://nostalgia.wikipedia.org, a browsable version of a snapshot of Wikipedia from 2001)
The Foundation's goals have remained in line with the ideal of volunteers freely creating content and distributing the world's information. Its mission statement is, in part,
to empower and engage people around the world to collect and develop educational content under a free license or in the public domain, and to disseminate it effectively and globally…. The Foundation will make and keep useful information from its projects available on the Internet free of charge, in perpetuity.
The rest of the story of Wikipedia belongs in Part IV. There we'll tell you about the current gamut of projects in many languages and about the Wikimedia Foundation. The key ingredients for these projects and the Foundation were already in place after the first six months: developers to work on the software, open authorship of content, an international and multilingual group of contributors, word-of-mouth publicity, and a loose but effective central control of infrastructure, with community-driven lightweight editorial mechanisms.
Wikipedia's growth is still entirely open ended—the project has simplified the problem of where to stop by completely disregarding that question. The number of articles on the English-language Wikipedia might still grow by a factor of three or four, or even more. For instance, information about geography, if added to the same depth for the rest of the world as it has been already for the United States, could swell the English-language Wikipedia to a size between 5 and 10 million articles.
There are better questions to ask, however, than simply concentrating on future growth. How easy is it to find fresh encyclopedic topics? When will the editing community switch to focusing on greater depth and quality for each individual article, rather than on greater breadth of coverage overall? This may well be happening already: Quality of content is becoming just as important as quantity (see Chapter 7 for more on these quality-focused projects and how to get involved).
Enquire Within Upon Everything was a bestselling Victorian reference and how-to book, first published in 1856 (and referenced in the name of Tim Berners-Lee's early web precursor project ENQUIRE). This would perhaps be a better title for Wikipedia, which is gradually becoming a reference about everything. But some caution is still required when using Wikipedia (see Chapter 4), and this is to be expected; the wiki culture has a deep acceptance of imperfection and incompleteness as both inevitable and perhaps even necessary for inspiring a working community.
[3] See Robert Collison, Encyclopaedias: Their History Throughout the Ages (New York: Hafner, 1966), 21.
[4] For a critique of the Encyclopaedia Britannica, see Harvey Einbinder, The Myth of the Britannica (New York: Grove Press, 1964). This book by Einbinder, a physicist, is authoritative only for the mid-century editions of Encyclopaedia Britannica; it has a hostile bias, but it contains much interesting discussion and research on general tertiary source issues, such as updating, celebrity authors, science coverage, and humanistic approaches.
[5] See Jonathan Israel, Radical Enlightenment: Philosophy and the Making of Modernity, 1650–1750 (Oxford: Oxford University Press, 2001), 134.
[6] Israel, Radical Enlightenment, 135.
[7] For a discussion of large-scale collaboration sympathetic to Linux, see James Surowiecki, The Wisdom of Crowds: Why the Many Are Smarter Than the Few and How Collective Wisdom Shapes Business, Economies, Societies and Nations (New York: Doubleday, 2004). For a history of GNU/Linux, see Glyn Moody, Rebel Code: Inside Linux and the Open Source Revolution (New York: Basic Books, 2001).
[8] See the Definition of Free Cultural Works (http://freecontentdefinition.org/), which the Wikimedia Foundation adopted for its projects in 2007 (http://wikimediafoundation.org/wiki/Resolution:Licensing_policy).
[9] Alexa is a Web-traffic measuring company that uses data from individuals using the Alexa toolbar (http://www.alexa.com/).