1 Introduction

Truth and Light, a now-defunct Dark Web site hosted on the Tor network, had the purpose of bringing people to Christ. As the site’s “About Us” page put it,

We are a very small group of people who in one way or another have become “brothers.” We all are unified in wanting to help you. It may sound strange to you but we do not know you and we want to help you. Let us prove it to you with our kindness. We have no hidden agendas. We are sick of corporations who “Grind the faces of the poor” crushing the needy and injured to get more money for themselves. We are upset with the Governments we’ve trusted who are more concerned with their private agendas then those that suffer underneath them.

Our Purpose: Help those that need it, lift people to Christ, Build His Kingdom.

This site was a blog, soliciting questions from curious visitors. One visitor asked,

why do you believe in what you believe in? Like, how can you just randomly believe in stuff like this??? What about proof??? Proof if it’s real, or nay???

Truth and Light’s response presented their theory of truth and reality:

Is “Proof” what can be proven by science? What about conflicting scientific theories? Is proof what you can detect with your senses? Radio waves cant be detected by your senses, do you disbelieve them? To me true “knowledge” is only given from God. I realize that many people won’t believe what we say, but we can make you believe that we genuinely care for you through our actions.

Another visitor asked a very different question:

Truth and Light! I need your help I have questions! Oh so many questions! … OK so I have been involved in the credit card game for a while during university then stopped, it was mostly centered on getting ccs from a friend that had $ money on the cards and going to a few stores, buying some stuff and selling it on kijiji/facebook etc. However now I heard that you can buy ccs from tor network WITH the pin#???

Rather than take this as an opportunity to turn this credit card fraudster on to Jesus Christ, Truth and Light offered more practical advice:

The truth is there are no legitimate card (Credit or debit) vendors on TOR anywhere. I am sorry my friend.

This is a book about the Dark Web and legitimacy.

Of course, when I refer to the “Dark Web,” or collections of websites that can only be accessed with special routing software, “legitimacy” is probably the last word to come to mind. The term Dark Web very likely evokes some decidedly illegitimate associations: drug markets, unregulated guns for sale, child exploitation images, stolen credit cards for sale, or phishing attacks. You might think of Silk Road, the infamous drug market busted in 2013, after it allegedly made billions of dollars selling everything from psychedelic mushrooms to heroin. Or you might recall the 2015 warning from James Comey, former head of the U.S. Federal Bureau of Investigation, that terrorists are “going dark,” hiding their communications behind veils of encryption and anonymous routing.1 Perhaps you think first of the Ashley Madison data dumps on the Dark Web, where personal information exfiltrated from the adultery site ended up for sale.

Associating Dark Web with “legitimacy” may seem odd, if not wrong. Or perhaps not. For each nefarious use of the Dark Web, we can find beneficial uses: the New York Times set up anonymous whistleblowing systems for people to point out government and corporate malfeasance. The Times also mirrors its content as a Tor hidden service, as does the nonprofit news organization ProPublica. Political dissidents use encryption and anonymizing networks to share their ideas. Bloggers take to the Invisible Internet Project (I2P) to write about personal privacy and computer security. As Jeremy Hunsinger has noted, people use the Dark Web to share and trade knowledge about “political theory, gender studies, physics, chemistry, and engineering,” knowledge that can be empowering to users and thus helps the Dark Web “gain legitimacy through the presence of this information.”2 Online communities develop open-source social networking software on Freenet to escape the confines of corporate social media giants Facebook and Twitter. And, as the quotations that open this chapter show, Christian evangelists have taken to the Dark Web to reach out and offer love to others. These sites promise access to “the real” or “the truth”—that is, legitimate knowledge.

Indeed, legitimacy can be a powerful window into the Dark Web. As I show in the next chapter, Dark Web users and site administrators, journalists and academics, law enforcement agents and dealers of illegal goods all use variations on the word “legitimacy” to describe Dark Web sites, practices, and technologies. They call certain things legitimate and others illegitimate (or, to use the parlance of the Tor- and I2P-based Hidden Answers site, “legit or sh!t”). At the core of this discernment is a trial of legitimacy, where the Dark Web’s uses and meanings are under intense scrutiny by a range of social groups. This trial is more complex than a stark illegitimate/legitimate dichotomy: “legitimacy” is a highly context-dependent term, with many shades of meaning and interpretation. Along the way, declarations of something being “legit,” in contrast to other, illegitimate things, mark moments of power practices.

To explore this trial of legitimacy, I focus in this book predominantly on the users and builders of Dark Web systems and sites. That is, rather than exploring how external entities (say, law enforcement agents or journalists) put the Dark Web through a trial of legitimacy, I am more concerned with the arguments of Dark Web builders, administrators, and participants about which networks, sites, practices, and uses are legitimate. As I show throughout this book, the construction—or denial—of the Dark Web’s legitimacy by network builders and Dark Web site users hinges on questions of violence, propriety, and authenticity. Specifically, I consider the Dark Web’s fraught relationships with the state, the legitimated holder of the monopoly on violence; with corporate or organizational propriety and power; and with the intense adjudication of authenticity, of inclusion and exclusion. The book draws on three years of participant observation, two dozen interviews with Dark Web site admins and users, and analysis of large archives of computer science papers, e-mail lists, and Internet Relay Chat (IRC) logs, all focusing on the makers of, administrators of, and participants in various Dark Web systems.

The consideration of violence, propriety, and authenticity in relation to the Dark Web provides a model for similar analyses of networks, communication, and technology more broadly. For example, for Internet scholars, focusing on legitimacy may help illuminate how states, corporate platforms, and user practices intersect, collide, and grate against one another. Much as Truth and Light justified their Christian evangelical blog on the Dark Web, we have to attend to questions of government, corporate, and social power practices as we consider any networked technology. Likewise, this book should be useful for communication scholars reflecting on how legitimacies are rhetorically constructed through the making of claims. Finally, for sociologists of technology, thinking through these legitimacies may help illuminate how any given technical achievement is, in part, an association of coercion, resources, and social categorization.

What Is the Dark Web?

But what do I mean by the Dark Web?

Journalism, academic literature, and popular books provide competing and contradictory definitions, centering on depth, morality, and technology. I find only the last useful. In terms of depth, a common definition offered is that the Dark Web comprises everything that search engines (e.g., Google, Bing) have not indexed. This could include material behind paywalls or login screens, databases, web pages generated based on short-term data (think of stock quotes or weather reports), or encrypted data. When journalists or academics use this definition of the Dark Web, they tend to suggest that the Dark Web is many times the size of the regular World Wide Web.3 Visually, they use images of icebergs or ocean depths to convey the sheer size and below-the-surface qualities of the Dark Web. This depth definition can be traced back to a 2001 white paper by Michael K. Bergman, titled “The Deep Web,” in which Bergman calls attention to all the resources not easily indexed by search engines.4 The depth approach to the Dark Web conflates Deep Web—which has been consistently defined as websites that search engines (especially Google) haven’t crawled—with Dark Web, which I take to be something else altogether, especially because many search engines do in fact crawl the Dark Web (including a custom Google engine).

Second, there is a definition of the Dark Web that plays on the moral or ethical connotations of “dark,” defining it as basically anything bad that happens on the web. For example, a research team at the University of Arizona sees the “reverse side of the Web as a ‘Dark Web,’ the portion of the World Wide Web used to help achieve the sinister objectives of terrorists and extremists.”5 The research team carries this through in their 2012 book Dark Web, confusingly including terrorist activities in the nonweb virtual world Second Life.6 Journalist Jamie Bartlett’s 2014 book The Dark Net uses a similar definition, detailing a host of subcultural activities, such as producing pornography, seeking child exploitation images, working on cryptographic systems, and trolling, as “dark” activities.7 In contrast to the use of icebergs and oceans, news stories using this definition tend to use images of hooded, faceless figures hunched over computer keyboards, green text on black backgrounds (à la The Matrix), or hands menacingly reaching through computer screens.

I reject those definitions in favor of a third, centered on technology. I define the Dark Web as websites built with standard web technologies (HTML, CSS, server-side scripting languages, hosting software) that can be viewed with a standard web browser, such as Firefox or Chrome, so long as that browser’s traffic is routed through special routing software. I do not define these sites in terms of whether Google has crawled them (the “deep” definition) nor based on the legality or morality of their content (the “morally dark” definition). The former is technically misleading, and the latter is subject to contentious debate. Moreover, the latter definition can be applied to a range of Internet technologies, including sites on the regular World Wide Web (including, as I discuss in chapter 7, Facebook).

Thus what makes the Dark Web “dark,” from a technological point of view, is that to access these sites, one must route Chrome or Firefox (or other browsers) through special routing software. This is the key difference between the Dark Web and what I will call the “Clear Web,” the regular World Wide Web. So, to access Dark Web sites on the Freenet network, one must be running the Freenet router. With that router running, Dark Web sites hosted on Freenet can be accessed with a standard browser via “localhost” (often the reserved IP address 127.0.0.1) with a port specified (often 8888). Accessing sites on the Invisible Internet Project (I2P) or on the Tor network can be done through similar techniques. Complicating this technology-based definition, the Dark Web is not singular but a variety of systems. This book explores three—Freenet, Tor, and I2P—but there are more, including ZeroNet and GNUnet, just to name two.
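To make the localhost pattern concrete, here is a minimal Python sketch, assuming a Freenet node is already running on its default FProxy port (8888); the site key shown is a hypothetical placeholder, not a real freesite.

    import urllib.request

    FPROXY = "http://127.0.0.1:8888"  # Freenet's local web interface (FProxy)
    SITE_KEY = "USK@examplekey/mysite/1/"  # hypothetical freesite key

    # The browser (or this script) talks only to localhost; the locally
    # running Freenet router fetches the content from the distributed network.
    with urllib.request.urlopen(f"{FPROXY}/{SITE_KEY}") as response:
        html = response.read().decode("utf-8", errors="replace")
    print(html[:200])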

The major differentiating factor between the Dark Web and the Clear Web is that these special routing systems are designed to provide anonymity for both visitors to websites and publishers of these sites. On the Clear Web, when we visit a website, at the very least our Internet protocol (IP) address is logged. IP addresses are key tools to track users across the Internet, thus linking browsing histories to user identities. Similarly, when we visit websites, we can pretty easily figure out where they are based in geographic space, and from there we can link their contents to the identities of publishers. This is especially the case with major corporations using Extended Validation SSL certificates (which, in addition to enabling the HTTPS in our browsers, display a verified organization name), because these corporate sites’ identities have been verified by third-party certificate authorities (more about this in chapter 7). Conversely, Dark Web technologies hide the IP address of site visitors as well as the physical location of the website publishers. As we browse Dark Web sites, information that could potentially deanonymize us is obscured. Likewise, using Freenet, Tor, or I2P, a publisher can set up a website without revealing the publisher’s physical location or identity.
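On the client side, this hiding can be illustrated with a minimal Python sketch, assuming a Tor client is listening on its default SOCKS port (9050) and the requests library is installed with SOCKS support; the .onion address is a hypothetical placeholder.

    import requests

    # socks5h (rather than socks5) resolves the .onion hostname inside
    # the Tor network, never through the local DNS resolver.
    proxies = {
        "http": "socks5h://127.0.0.1:9050",
        "https": "socks5h://127.0.0.1:9050",
    }

    # The destination server sees only the Tor circuit,
    # never the client's real IP address.
    response = requests.get("http://examplehiddenservice.onion/", proxies=proxies)
    print(response.status_code)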

Thus, although Freenet, Tor, and I2P implement anonymous web technologies in very different ways, they all provide readers and publishers with anonymity by allowing them to browse and publish anonymous websites. Here, the connotation of “darkness” in Dark Web has more to do with encryption, anonymization, and leaving standard communications channels (as in the phrase used by James Comey, “going dark,” meaning moving off open, easily monitored communications channels). “Web” refers, of course, to websites, web browsers, HTML, and CSS.

Using this definition, we can see that the other definitions are flawed. The depth approach is not quite right because it presents the Dark Web as many times the size of the Clear Web. In fact, the Dark Web comprises only thousands, possibly tens of thousands, of sites—far, far fewer than the billions on the World Wide Web.8 The repeated image of the Dark Web as the “iceberg” under the World Wide Web’s “tip” is simply wrong. Moreover, accessing the Dark Web is not quite like penetrating deeper and deeper layers of the web, each one more difficult to access than the last. To be certain, configuring browsers to use special routing software to access Dark Web sites can be daunting to new users, but such configurations and software are well documented and can be installed and running on a computer within minutes. In the case of Tor, a preconfigured version of Firefox, the Tor browser, is available for download. With the routing software in place and a bit of Googling, one can easily find Dark Web sites to visit. Indeed, some Dark Web sites, especially multivendor markets where vendors and administrators have a financial interest in attracting a lot of traffic, are quite easy to get to and have hired publicists to get the word out about them.9 The now-defunct Tor-based drug market search engine Grams, for example, actively made finding drug vendors simpler. Of course, other Dark Web sites, hidden behind login walls and available only to those who are vetted, are harder to access. But this is the case with the Clear Web, as well, where some sites are easy to find and access and others much harder. If anything, the Dark Web functions much like the regular web—with the key exception that one needs special routing software to access it, software that can protect the identity of site readers and publishers. It is not deeper than the regular web in any logical sense.

And unlike the moral or ethical connotations of “dark,” my definition of the Dark Web as websites only accessible with special routing software does not predetermine any normative judgment about the content these sites contain. As I show in this book, many sites and services on the Dark Web would, at the very least, not rise to the level of terrorist or extremist activities, or even warrant salacious news stories. Some Dark Web sites are downright boring, providing cat facts, highly specialized computer networking technology discussions, an implementation of the ELIZA chatbot, or a means to play chess anonymously. As I describe in chapter 7, some major Internet corporations, such as Facebook, are moving onto the Dark Web. In more generous interpretations, many Dark Web sites might be judged as valuable forums of personal and political expression, allowing political dissidents to express their views without fear of government reprisal, or enabling people to socialize without fear of corporate surveillance. To be certain, the Dark Web contains some very troubling content: stolen personal information, so-called revenge pornography, and child exploitation images, to name a few. But as Bartlett’s book and the Arizona researchers show, this is the case with the Clear Web, too.10

All too often, the depth, morality, and technology definitions of the Dark Web get conflated into a confusing mix. Take, for example, Gabriel Weimann’s academic article “Going Dark: Terrorism on the Dark Web,” in which he argues that

the deepest layers of the Deep Web, a segment known as the Dark Web, contain content that has been intentionally concealed. The “Dark Web” can be defined as the portion of the Deep Web that contains generally illegal and anti-social information and can only be accessed through specialized browsers. Thus, for example, the Dark Web is used for material such as child pornography, unauthorized leaks of sensitive information, money laundering, copyright infringement, credit card fraud, identity theft, illegal sales of weapons, and so on. … In 2014, journalist Jamie Bartlett in his book The Dark Net describes a range of underground and emergent sub-cultures, including social media racists, cam girls, self-harm communities, drug markets, crypto-anarchists and transhumanists. In recent years, the Dark Web has been moving toward more secretive locations due to the crackdown of government agencies on it.11

Here, Weimann mixes many of the confusing meanings of the Dark Web. To be fair, he notes it comprises websites that “can only be accessed through specialized browsers,” which is somewhat similar to the definition I work with here. But he also describes it as occupying the “deepest layers of the Deep Web,” as if we can arrange the web into such layers, and claims that these deep layers contain “anti-social information,” playing on the moral connotations of “dark.” And he cites Bartlett’s book, which studies subcultures we may or may not find morally acceptable (racists, pornography-producing “cam girls,” transhumanists, crypto-anarchists), all of whom we can find on the regular web.12 Moreover, Weimann suggests, somewhat confusingly, that the “Dark Web has been moving toward more secretive locations,” again implying that there are deeper and deeper layers of the web (and that parts of the web “move” to other parts). The problem with such confusing sets of conflations is that they perpetuate the idea that the Dark Web is (a) a massive, deeper layer “underneath” the regular web; (b) composed solely of illegal, “dark” activities; and (c) only accessible to highly skilled computer users willing to continually delve deeper into the web. It’s a seductive, terrifying trope, the idea that some monstrous collection of horrifying data lurks beyond the reach of the average web user. These connotations can help sell newspapers and security research white papers, but as attractive as they are, they are wrong.13

So, after all this, should I even use the loaded term Dark Web? Why not Anonymous Web, Invisible Web, or Hidden Web? Or why not coin a new name? The truth is that no perfect term describes the systems I am studying, but the term Dark Web has some advantages. First, many participants on the Dark Web use the term, so it is recognizable among those whom I study, even if they also regularly argue over its definition, as well as its accuracy and desirability as a name for anonymous websites.14 Indeed, mindful of the nefarious connotations of “dark,” the Tor Project at one point hired a marketing firm to come up with a new label for anonymous Tor-based websites. The results have been mixed, however, with “onions” or “onionspace” (references to Tor’s top-level domain name, .onion) being proposed but not catching on.15 Thus, even the creators of these networks have trouble moving away from the pithy, provocative, and commonly used moniker Dark Web. Other names, such as Anonymous Web, Invisible Web, and Hidden Web, are sometimes used but have not caught on. Any name I coin would be immediately ignored by the thousands of people who use these systems. The only other terms commonly used are Deep Web and Dark Net. I reject the first for the reasons given above. I don’t use the second—which is arguably the most common term—because Dark Web helps narrow the focus to web technologies, as opposed to broader Internet technologies, such as IRC, BitTorrent, or e-mail protocols, which can be routed through the same network software that enables anonymous web publication and browsing. Focusing on the “Web” in Dark Web thus helps limit the scope of this book to sites marked up with HTML and presented in web browsers. Thus, even as the term brings with it some confusion, I use Dark Web (or, since I am writing about three systems, the plurals Dark Webs and Dark Web systems) throughout this book.

Methodology: Dark Web Situational Analysis

How did I arrive at legitimacy as a key lens through which to look at the Dark Web? To put it simply, I felt that the data demanded an engagement with this concept. To show the path toward my focus on legitimacy, I want to take a moment to talk about the methodology of this study. Perhaps the best guide into complex heterogeneous associations, such as the Dark Web, is Adele Clarke’s excellent Situational Analysis: Grounded Theory after the Postmodern Turn.16

Clarke’s emphasis is on “situations,” collections of elements that, through gestalt, become greater than the sum of their parts. Drawing on Chicago sociology and mixing in the feminist scholarship of Donna Haraway, Clarke argues that situations are “relentlessly relational and ecological” and must be attended to in their specificity.17 Many elements go into any given situation: visual features, “knowing subjects,” discourses, narratives, histories, technical capacities, and materialities.18 In this, Clarke draws on actor-network theory, a school within science and technology studies that includes scholars such as John Law, Madeleine Akrich, Michel Callon, Annemarie Mol, Susan Leigh Star, and Bruno Latour. Clarke is also deeply indebted to Michel Foucault. All these scholars, from Clarke to Haraway, Law to Foucault, demand that the researcher attend to a bewildering array of objects, from other subjects to images to technical infrastructures. Moreover, rather than seeking ultimate causes for a situation, the concern is with the situation as such and the relations among its elements. As John Law puts it, instead of tracing back to causes, the researcher considers elements as “effects” of larger networks.19

Researching such situations is a daunting task. For Clarke, the way forward is through charting relationships between heterogeneous elements involved in a situation: discourses, visual elements, and nonhuman elements. The goal is to answer questions such as “Who and what are in this situation? Who and what matters in this situation? What elements ‘make a difference’ in this situation?”20 As researchers trace relations among these elements, they pay attention to both visibility and invisibility, presence and absence, voice and silence.21

To trace such relations, I draw on three main streams of data. First, I rely on Foucaultian genealogical sensibilities by exploring archives or building new ones.22 Any student of Dark Web systems has quite a few preexisting archives to draw on, including those of developer mailing lists, IRC logs, wikis (including their version histories), code versioning systems, and online forum posts. For example, an important resource for chapter 4 is Gwern Branwen and colleagues’ archive of Darknet Markets.23 These purposely archived market sites are important sources. In addition, thousands of interactions are happening right now on Dark Web social networking sites, bulletin boards, forums, and chat systems. Some of these interactions are more or less persistent, being stored and visible on these sites for months or even years, but many are ephemeral: Dark Web sites regularly appear and then disappear after a few months or days.24 Throughout my research, I sought to capture such items (using screenshots and Zotero web archiving) and construct my own coded archives of Dark Web interactions, but of course I missed much more than I could gather.25
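Purely as an illustration of such ad-hoc capture (my actual workflow relied on screenshots and Zotero), the following Python sketch saves a page fetched through a local Tor SOCKS proxy under a timestamped filename; the URL is a hypothetical placeholder.

    import datetime
    import pathlib
    import requests

    proxies = {"http": "socks5h://127.0.0.1:9050",
               "https": "socks5h://127.0.0.1:9050"}

    url = "http://examplehiddenservice.onion/forum/thread"  # hypothetical
    page = requests.get(url, proxies=proxies).text

    # Timestamped filenames preserve the order of captures even after the
    # original site disappears without a trace.
    stamp = datetime.datetime.now(datetime.timezone.utc).strftime("%Y%m%dT%H%M%SZ")
    out = pathlib.Path("archive") / f"{stamp}.html"
    out.parent.mkdir(exist_ok=True)
    out.write_text(page, encoding="utf-8")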

Whether intentionally archived (as in the case of mailing lists) or archived by me or other researchers, these data provide not only a wealth of textual information, but also visual artifacts—logos, avatars, shared photos, memes—all of which could be “mapped” in the sense Clarke describes. Thus, textual and visual information combine into a multimodal form, a discourse that shapes and is shaped by social interaction and that reveals traces of power dynamics. As Clarke argues, “If knowledge is power in the Foucaultian sense, attending to the ways in which knowledges are produced, legitimated, and maintained through language/through discourse/through discursive practices becomes central in analyzing power of all kinds.”26 Indeed, looking ahead to the key term of the book, legitimacy, power practices are a central concern, and thus I must attend to how power relations appear in these archives.

The second main stream of data for this project is drawn from participant observation and interviewing. For example, I’ve made accounts on dozens of Dark Web sites over three years, paying attention to, and in some cases contributing to, the daily life of interactions on these sites. I contributed to Dark Web wiki pages, ran a blog, collaborated on a privacy policy for a social networking site, inserted Freenet files, hosted my homepage on Tor and I2P, and helped co-edit a Dark Web literary magazine. I’ve gotten to know key members of several Dark Web sites, and in two dozen cases, I moved from interactions “in public” (which is to say in the more “public” portions of these hidden sites) to “private” conversations and “branching and building” semistructured interviews (following, of course, principles of informed consent and confidentiality).27

For guidance here, I turn to the work of digital ethnographers, such as Nancy Baym, Annette Markham, Monica J. Barratt, Tom Boellstorff, Alexia Maddox, and Gabriella Coleman. Boellstorff, author of Coming of Age in Second Life, offers an especially invaluable methodological insight. Whereas many ethnographers seek to ground online interaction in offline identities—a move that certainly adds to the researcher’s understanding of the dynamics of online interaction—Boellstorff chose to treat the virtual Second Life as a culture unto itself, deciding not to link Second Life avatars and activities to their “First Life” counterparts.28 For the participant observation and interviewing I engaged in for this project, I did not have such a choice: as a rule, Dark Web site participants do not reveal any personal information that could be used to resolve their online identities to offline identities, because Dark Web systems provide a great deal of anonymity.29 This is the case even for those engaged in seemingly mundane activities, such as sharing recipes or playing chess. Thus, especially in moments of informing potential interviewees about my position as a researcher seeking to publish articles and books, I stressed that I was not seeking any personal information (age, gender identity, location, ethnicity, etc.). But, in the spirit of Boellstorff’s work, interviewees still had much to offer even without anchoring their online identities in their offline identities, including insights into power dynamics, daily practices, and histories of the sites I studied.

Finally, I explored the nonhuman aspects of these networks, especially the networking software they rely on. Fortunately, the networks I consider in this book (Freenet, Tor, and I2P) are all open-source projects, which means that their source code and documentation are open for inspection and that they are built through a collaborative, iterative process.30 And of course, to access Dark Web sites themselves, I had to download, install, run, and update these software packages on my computers, tablets, and smartphones. My engagement with this software is in many ways just as intense as my engagements with the Dark Web participants: each software package demands configuring, updating, and constant attention, especially because of the discourses about privacy and security that accompany them (in other words, the ideal is to keep the software and its dependencies up-to-date to avoid security vulnerabilities). This combination of open code, open documentation, and the experience of running software provides more data that can help me understand the Dark Web situation.

Here, I draw on insights from the field of software studies, which includes scholars such as Matthew Fuller, Anne Helmond, David Berry, and Wendy Chun. Researchers in this field consider software in layers, from operation, interface, functions, and lines of code, down to the hardware platforms on which the code runs.31 Following Rob Kitchin and Martin Dodge, I thus paid attention to Dark Web “software as both product and process … [which] needs to be understood within a framework that recognizes the contingent, relational, and situated nature of its development and use.”32 As products, anonymizing network software packages are produced by many different types of developers, ranging from self-taught, self-described hackers to PhD-holding computer scientists specializing in encryption algorithms, many of whom work from different locations around the world. Moreover, such software is produced through open-source practices, which include combinations of ad-hoc and formal organizational structures, control of software versions, and lively technical debates.33 As processes, they run in the background on a computer, networking the computer with others around the world, shunting data to and fro, and shaping interaction with protocols. We can interface with them in various ways, through command lines or graphical interfaces, both of which carry certain assumptions about how end users are to be configured.34 Thus, the Dark Web software systems described in this book offer rich insights into the power relations of this situation.

Any of these research streams on their own would, I feel, be inadequate for a wider view of the Dark Web systems discussed in this book. Taken together, however, they provide multiple layers to move across. To digest this array of data (archival, interview, and software), I’ve found Clarke’s pragmatic, detailed approach to be extremely valuable. Her emphasis on diagramming diverse situational elements, including technologies, social groups, cultural tropes, social institutions, and debates, is a key approach I take in this book.

Pragmatic Keyword Analysis

As I observed and participated in Dark Web sites and mapped the relationships among their discursive and technical elements, the curious term “legitimacy” and its variants came up again and again. As I show throughout this book (starting especially in chapter 2), this term appeared in offhand comments made on Dark Web social networking sites. The term is commonly used in markets, where new users anxiously seek to distinguish legit vendors from scammers (note its use in the Truth and Light example that opens this chapter). The term appeared in Clear Web coverage of the Dark Web, especially in reports on the efforts of law enforcement to find and shut down illegitimate websites. It appeared in comments made about changes to encryption algorithms in the code, and in comparisons between illegal and legitimate business models.

Legitimacy thus became for me what Clarke calls a “sensitizing concept” and what Colin Koopman and Tomas Matza call a “category.”35 Legitimacy became a lens with which to look at the Dark Web. And, as we will see, this is a trifocal lens—or, perhaps better, a progressive lens, as various connotations of “legitimacy” traffic into one another in a symbolic economy. Although focusing so much on this one category of inquiry may appear to limit the analysis of the heterogeneous Dark Web situation, the multivalent uses of “legitimacy” among Dark Web participants and commenters offer complex insights into power practices, social organization, and technological development, even as the concept helps narrow the focus of the book.

Thus, to consider this sensitizing concept, I turn to pragmatic keyword analysis. Here, a good guide is Nicholas A. John’s excellent book The Age of Sharing. In his study of the word “sharing”—an important term for today’s social media practices—John takes up a “pragmatic approach” from linguistics. That is, rather than asking “What practices should we call sharing?,” he asks, “What practices do we call sharing?”36 In other words, rather than seeking to adjudicate which activities can properly be called “sharing” and which should not, John is more interested in the lively, messy multiplicity of meanings of sharing as the term is used across various domains. In his book he considers “the sharing economy” alongside sharing one’s thoughts alongside sharing a portion of one’s food, with these overlapping and sometimes contradictory meanings revealing “insights into contemporary culture, and especially digital culture.”37

John draws on the work of Raymond Williams, particularly his book Keywords. Based on situational analysis of Dark Web systems, from computer science paper archives, developer mailing lists, IRC logs, and thousands of forum posts, to site participant observation and interviews with site administrators and users, to lines of code and software interfaces, I came to see the term legitimacy as a keyword in Williams’s sense. Much like Williams’s keywords, the word “legitimacy” “virtually forced itself on my attention because the problems of its meanings seemed to me inextricably bound up with the problems it was being used to discuss.”38 This problematic word—just as slippery as “sharing”—provides a progressive lens into Dark Web practices, at one moment drawing attention to violence, then to propriety, and then to authenticity. With this concept in mind, and following the iterative process Clarke advocates, I then returned to Dark Web sites and further refined the analysis, using legitimacy as a sensitizing lens to rethink the archival and interview data I had collected. This book is a product of these approaches.

Plan of the Book

Because this is a book about legitimacy, the next chapter, “Violence, Propriety, Authenticity: A Symbolic Economy of the Dark Web,” presents three distinct meanings of that word. First is a meaning that appears predominantly in political philosophy: Max Weber’s conception of a state that has made a successful claim to a monopoly on legitimated force. Thus, this meaning of legitimacy is intimately tied to violence: who can wield it and with what effects. But more precisely, it is tied to struggles over claims to the monopoly on violence in a society. The second meaning of legitimacy I address in chapter 2 is found in organizational and managerial communication: legitimacy as propriety, in the double sense of respectable behavior and proprietorship—in other words, commanding both respect and resources. Whereas the power practices of states involve violent force (military interventions or policing), for organizations, the struggle is over which organizations command what resources, and how well those claims to resources are respected by other social groups. Finally, a meaning of legitimacy not often explicitly defined can be found in streams of writing about popular culture: legitimacy as authenticity, or “legit.” This form of legitimacy is tied to communities of practice, who develop—and monopolize—shared sets of symbols and languages. Whereas power at the state level can be expressed through violence, and power in an organization is often expressed through command of resources, power among the “legit” is tied to social inclusion and exclusion. After laying out these three meanings in chapter 2, I present a symbolic economy by which legitimacy is trafficked across domains, with methods such as inheritance, exchange, appropriation, purchase, and delegitimation. All the meanings of legitimacy and symbolic economic practices associated with legitimacy are tied back to statements made by Dark Web participants and commenters.

Chapter 3, “The Dark Web Network Builders,” details the development history of the three Dark Web systems discussed throughout this book: Freenet, the Tor Project, and the Invisible Internet Project (I2P). I trace how each project developed networks that can anonymize both readers and publishers of web technologies. Thus, the chapter emphasizes the importance of web publishing on these networks, which was not necessarily the original intention of the network builders but nonetheless emerged quickly as the networks took shape. Web publishing on these anonymous networks became known as Dark Web publishing. I also consider the projects’ places within the three legitimacies (violence, propriety, and authenticity), examining the relationship between these projects and states, the ways in which these projects appear as organizations, and the struggles over authenticity as project developers contest one another’s network designs. As the chapter shows, the Freenet, Tor, and I2P projects have each engaged in complex negotiations with state power, organizational propriety, and the performative dimensions of being legit software developers who can make successful anonymous networks.

The heart of the book focuses on the specific forms of legitimacy in turn. Chapter 4, “From Agorism to OPSEC: Dark Web Markets and a Shifting Relationship to the State,” takes on the relationship between Dark Web markets and state violence. Specifically, I consider a shift in thinking among Dark Web market participants about the state’s claims to a monopoly on legitimated violence. The chapter starts with agorism. Agorists are radical market libertarians who believe that state violence is illegitimate, and that justice and security ought to be solely distributed via market mechanisms. Agorism became a dominant political ideology of the Silk Road, the first major Dark Web drug market. The Silk Road agorists argued that selling drugs outside state control would undermine the state altogether. However, the Silk Road was seized in 2013 and its founder arrested. Rather than curtailing political thinking, the end of the Silk Road ushered in a new relationship to state violence in the form of what I call “OPSEC politics.” OPSEC, or operations security, is a practice originally developed by the U.S. military and later appropriated by Dark Web market participants. I show in chapter 4 how OPSEC politics helps produce new social formations that are more in line with larger discourses associating communication and violence.

The next chapter, “Searching for the Google of the Dark Web,” explores search engines as legitimate organizations. Drawing on interviews with software developers who have taken on the challenge of searching Freenet, I2P, and Tor web content, I consider their claims to legitimacy as propriety, that is, commanding respect and commanding resources. I trace how Dark Web search engines integrate themselves into networks and become obligatory points of passage, mediating between a host of other entities, including users, site administrators, law enforcement agents, and software protocols. I conclude with an analysis of the techniques by which Dark Web search engines lay claim to an important inheritance: the legitimacy of being called the “Google of the Dark Web.”

Chapter 6, “Being Legit on a Dark Web Social Network,” focuses on the final meaning of legitimacy, as authenticity. To illustrate this meaning, I consider how members of a specific Dark Web social networking site, Galaxy2, negotiate the tensions between social networking practices, pseudonymity, and administrative rules, seeking to be “legit” members of the site. While contemporary social networking has a set of now-standard practices (gather friends, gather likes, share content), these practices take on different shapes when introduced into anonymizing networks. Community norms and explicit rules are used by Galaxy2 members and administrators to cultivate a particular site culture. Those who are included in the culture are legit; those who are not legit are excluded from the site. The predominant mode of interaction hinges on building profiles while refraining from offering personal information. As the chapter shows, however, in some cases—specifically when members disclose that they are young and female—the rules of authenticity change in disturbing ways.

Chapter 7, “Facebook and the Dark Web: A Collision,” traces the symbolic economy of legitimacy that resulted in the Tor Project’s successful registration of .onion with Internet standards bodies. While Internet standards are highly technical, backgrounded, and infrastructural, they have profound consequences for the billions of Internet users. As I show in this chapter, they also have consequences for Dark Web systems. Because Internet standards groups recognize .onion—a recognition prompted in no small part by Facebook’s involvement in the process—Tor hidden services can now get Extended Validation certificates. This can lead to more “legitimate” (in the sense of propriety) sites mirroring their operations on the Tor network. In addition, Tor’s success hinged in part on delegitimating rival networks, such as I2P. I conclude the chapter by considering how Facebook’s presence on the Tor network blurs the lines between Dark and Clear Webs.

The book concludes with a short chapter arguing for the value of anonymous political speech in a time of ubiquitous surveillance. I acknowledge the calls to end the development of anonymizing networks because so much illegitimate activity happens on them, but I argue that, in a time of increasingly monitored digital communications, shutting down the Dark Web would mean losing a valuable means of political speech and dissent.

Caveats and Shortcomings

As a single researcher exploring three anonymizing networks, including their archives, participants, and software systems, I face many shortcomings, ranging from a skills deficit (I have no training in computer science) to strong personal views (I have particular stances on resisting the corporate-dominated Clear Web, as can be seen in my previous published work).39 Here, I want to caution the reader a bit.

Language Limitations

I am a native English speaker. I am an American. These are limitations. There are many sites on Tor, I2P, or Freenet that are in languages other than English. I can read Spanish, but beyond that, I have difficulties with sites in other languages. I2P, for example, has many sites in Russian, and Tor is seeing increasing growth in Russian-language users and sites. One might simply suggest that I use an online translation system, such as Google Translate, to read the contents of non-English and non-Spanish sites. Even setting aside the problems of machine translation, however, I believe this would be a privacy violation. Google is, of course, notorious for its practice of absorbing every bit of information it encounters. This includes text entered into Google Translate. I must assume that people running Dark Web sites are doing so in part to avoid being monitored by Google, so I have never fed any site text into Google Translate. Nonetheless, one advantage of speaking today’s de facto lingua franca is that most Dark Web development (detailed in chapter 3) is conducted in English, and many Dark Web sites use English as a primary language. Even so, the analyses offered in this book are certainly constrained by my linguistic limitations.

The Dark Web Changes Constantly

As Monica Barratt and Alexia Maddox aptly put it, the Dark Web is a constantly “fracturing digital environment.”40 Much of the Dark Web has changed, even during the course of this writing, and much will change after the book comes out. Dark Web sites are notoriously ephemeral, appearing online for a few months and disappearing without a trace. To be certain, some last years, but these are rare. The sites I write about here may very well be gone by the time this book is published. Indeed, two notable Tor hidden services, Grams and Galaxy2, have gone offline during the copyediting phase. No doubt I missed important sites as I did the research for this book. Complicating this situation, there is no Archive.org for Freenet, Tor, and I2P, and the makers and users of these sites don’t think of what they are doing in historical terms, so they rarely save their old content.41

The result of this constant change is that my analysis will be difficult to replicate: the sites I examine and the people I have interviewed may be impossible for others to find. In light of this, and for the convenience of the reader, I include Clear Web links to the sources I’m drawing on whenever possible. Nonetheless, in many instances, I must refer to sources that are solely available via Tor, I2P, or Freenet. These links will be marked in any endnotes or citations with [Tor], [I2P], or [Freenet]. As discussed above, these links cannot be reached without the use of their respective routing programs, so anyone wanting to follow up and verify my work would have to install the routing software, configure a browser to use that software, and follow the links, assuming they are still active. In those cases where I believe the site owner does not want the link shared, I will withhold the link.

In addition, I focus on three key Dark Web site formats: markets, search engines, and social networking sites. These formats do not exhaust the types of sites found on Freenet, Tor, and I2P, which also host pornography sites, chat sites, forums, and blogs. Like the Clear Web, the Dark Web has a wide range of sites, and this book cannot cover them all.

My Own Legitimacy

Finally, what of my own position as a researcher, or as a Dark Web participant? To put it in terms of this book, am I legitimate? In terms of academic work, at the very least, I hope my deep archival work and several years working with Dark Web systems, in addition to participant observation and interviews with Dark Web administrators and users, provide some answer here. As for my interpretations of all these data: they are of course subject to debate.

On a related note, I will say that I do not believe I am “giving voice” to Dark Web users. They are already quite vocal. I do not think of this work as “representing” them either. Frankly, to use a practice I discuss in this work, at best I can say that I am engaged in an exchange of the legitimacies of the Dark Web, trafficking these legitimacies from one domain into another, in this case from Dark Web sites to an academic study. I also must face the fact that, at worst, I am appropriating the legitimacy of Dark Web makers and participants, simply taking their ideas and presenting them in a book published by an academic press and thus benefiting professionally from the work of others. Mindful of this problem, I have labored to make this book as much an exchange of legitimacy as possible.

Notes