Introduction
1Dami Lee, “Apple Says There Are 1.4 Billion Active Apple Devices,” Verge, January 29, 2019, https://www.theverge.com/2019/1/29/18202736/apple-devices-ios-earnings-q1-2019.
2As of April 2018, there were 7.1 billion global bank cards enabled with EMV (Europay, Mastercard, and Visa) “chip and PIN” technology: “EMVCo Reports over Half of Cards Issued Globally Are EMV®-Enabled,” EMVCo, April 19, 2018, https://www.emvco.com/wp-content/uploads/2018/04/Global-Circulation-Figures_FINAL.pdf.
3This is WhatsApp’s own figure from mid-2017, but even if slightly exaggerated, it is most likely of the correct order of magnitude: WhatsApp, “Connecting One Billion Users Every Day,” WhatsApp Blog, July 26, 2017, https://blog.whatsapp.com/10000631/Connecting-One-Billion-Users-Every-Day.
4Mozilla reported that the percentage of web pages loaded by Firefox browsers using https (encrypted) as opposed to http (not encrypted) passed the 75 percent mark in 2018: “Let’s Encrypt Stats,” Let’s Encrypt, accessed June 10, 2019, https://letsencrypt.org/stats.
5Enigma (directed by Michael Apted, Jagged Films, 2001) is a fictional account of cryptographers working at Bletchley Park, England, during the Second World War and their efforts to decrypt traffic encrypted by Nazi Enigma machines. In Skyfall (directed by Sam Mendes, Columbia Pictures, 2012), James Bond and his master technician, Q, engage in some impressive (and somewhat implausible) analysis of encrypted data. Sneakers (directed by Phil Alden Robinson, Universal Studios, 1992) was arguably a film ahead of its time, featuring two students who hack into computer networks and eventually end up embroiled in the world of intelligence gathering and devices capable of breaking cryptography.
6CSI: Cyber (Jerry Bruckheimer Television, 2015–16) is an American drama series involving FBI agents investigating cybercrimes. It features some unusual cryptographic practices, including the storage of encryption keys as body tattoos! Spooks (Kudos, 2002–11), also known as MI-5, is a British television series about fictional intelligence officers. Several of the episodes feature agents having to make sense of encrypted data, often demonstrating extraordinary abilities to overcome encryption in real time!
7Dan Brown has featured cryptography in several of his books, most notably Digital Fortress (St. Martin’s Press, 1998), which is based around surveillance and a machine capable of breaking all known encryption techniques. Interestingly, Brown’s most famous novel, The Da Vinci Code (Doubleday, 2003), stars a cryptologist but does not, itself, feature any cryptography per se.
8My colleague Robert Carolina argues that cyberspace is not a place but rather a medium of communication. He draws a comparison between cyberspace and televisionland, which was a term used at the advent of television to describe the abstract connection between people and a new technology. Just as “Good morning, everyone in televisionland!” (the opening phrase of the Apollo 7 crew’s first broadcast from space in 1968) seems a preposterous greeting to us today, so Carolina expects the concept of something being “in cyberspace” to eventually fade from use. I tend to agree.
9Cyberspace is an extremely hard concept to define. The novelist William Gibson is widely credited with first using the term, but modern definitions tend to be based on abstract descriptions of computer networks and the data that resides on them. Dr. Cian Murphy (University of Bristol), speaking at Crypto Wars 2.0 (the third inter-CDT cybersecurity workshop, University of Oxford, May 2017), suggested a more concise definition: “I don’t like the word cyberspace—I prefer electronic stuff.”
10According to Internet World Stats (Miniwatts Marketing Group), accessed July 14, 2019, https://www.internetworldstats.com/stats.htm, just over half of the world population is now online.
11“2017 Norton Cyber Security Insights Report Global Results,” Norton by Symantec, 2018, https://www.symantec.com/content/dam/symantec/docs/about/2017-ncsir-global-results-en.pdf.
12Almost half of all organizations claim to have suffered from fraud and economic crime, of which 31 percent is attributed to cybercrime: “Pulling Fraud Out of the Shadows: Global Economic Crime and Fraud Survey 2018,” PwC, 2018, https://www.pwc.com/gx/en/services/advisory/forensics/economic-crime-survey.html.
13These are wild estimates of an unmeasurable quantity. However, they capture the idea that as we do more things in cyberspace, so, too, can we expect to be defrauded more in cyberspace. This particular estimate is from “2017 Cybercrime Report,” Cybersecurity Ventures, 2017, https://cybersecurityventures.com/2015-wp/wp-content/uploads/2017/10/2017-Cybercrime-Report.pdf.
14The computer malware Stuxnet was used to attack the Natanz uranium enrichment plant in Iran, whose centrifuges were noticed to be failing in early 2010. This was arguably the first globally reported example of a significant industrial facility falling victim to an attack from cyberspace. In addition to stirring up emotions about international politics and nuclear danger, Stuxnet served as a wake-up call to everyone that critical national infrastructure is increasingly connected to cyberspace. The attack on Natanz did not come directly from the internet, but is believed to have originated from infected USB memory sticks. Much has been written on Stuxnet and Natanz—for example, Kim Zetter, Countdown to Zero Day: Stuxnet and the Launch of the World’s First Digital Weapon (Broadway, 2015).
15In November 2014, Sony Pictures Studios was subjected to a raft of cyberattacks, resulting in the release of confidential employee information and deletion of data. The attackers demanded that Sony stop the release of an upcoming comedy film about North Korea. See, for example, Andrea Peterson, “The Sony Pictures Hack, Explained,” Washington Post, December 18, 2014, https://www.washingtonpost.com/news/the-switch/wp/2014/12/18/the-sony-pictures-hack-explained/?utm_term=.b25b19d65b8d.
16Widely reported key reinstallation attacks affected the WPA2 protocol, which is a security protocol used to cryptographically protect Wi-Fi networks: Mathy Vanhoef, “Key Reinstallation Attacks—Breaking WPA2 by Forcing Nonce Reuse,” last updated October 2018, https://www.krackattacks.com.
17The ROCA attack exploits a vulnerability in the generation of RSA keys in a cryptographic software library used by smartcards, security tokens, and other secure hardware chips manufactured by Infineon Technologies, which results in private decryption keys becoming recoverable: Petr Svenda, “ROCA: Vulnerable RSA Generation (CVE-2017-15361),” release date October 16, 2017, https://crocs.fi.muni.cz/public/papers/rsa_ccs17.
18The Meltdown and Spectre bugs exploited weaknesses in commonly deployed computer chips and were reported in January 2018 to affect billions of devices around the world, including iPads, iPhones, and Macs: “Meltdown and Spectre: All Macs, iPhones and iPads affected,” BBC, January 5, 2018, http://www.bbc.co.uk/news/technology-42575033.
19The WannaCry cyberattack crippled many older computers in the UK’s National Health Service (and elsewhere) by installing ransomware that encrypted the disks of affected computers and then demanded a ransom to unlock the trapped data. The National Audit Office later published a detailed investigation into the incident and how it could have been prevented: Amyas Morse, “Investigation: WannaCry Cyber Attack and the NHS,” National Audit Office, April 25, 2018, https://www.nao.org.uk/report/investigation-wannacry-cyber-attack-and-the-nhs.
20Comey became somewhat of a legend in cybersecurity circles for various quotes regarding his anxieties over how the use of cryptography hampers law enforcement. In one such statement from September 2014, he was said to be worried about the strengthening of encryption services on various mobile devices: Ryan Reilly, “FBI Director James Comey ‘Very Concerned’ about New Apple, Google Privacy Features,” Huffington Post, September 26, 2014, http://www.huffingtonpost.co.uk/entry/james-comey-apple-encryption_n_5882874. In a May 2015 announcement, Comey was reported to be even more upset: Lorenzo Franceschi-Bicchierai, “Encryption Is ‘Depressing,’ the FBI Says,” Vice Motherboard, May 25, 2015, https://motherboard.vice.com/en_us/article/qkv577/encryption-is-depressing-the-fbi-says.
21Like him or loathe him, Snowden’s revelations have been highly influential, and I will discuss them in much greater detail when I later consider the dilemma created by the use of cryptography.
22Cameron’s answer to his own question was: “No, we must not.” This remark was widely interpreted as proposing a ban on encryption technology: James Ball, “Cameron Wants to Ban Encryption—He Can Say Goodbye to Digital Britain,” Guardian, January 13, 2015, https://www.theguardian.com/commentisfree/2015/jan/13/cameron-ban-encryption-digital-britain-online-shopping-banking-messaging-terror.
23Brandis made this announcement ahead of a meeting of the Five Eyes intelligence alliance: Chris Duckett, “Australia Will Lead Five Eyes Discussions to ‘Thwart’ Terrorist Encryption: Brandis,” ZDNet, June 26, 2017, https://www.zdnet.com/article/australia-will-lead-five-eyes-discussions-to-thwart-terrorist-encryption-brandis.
24Kieren McCarthy, “Look Who’s Joined the Anti-encryption Posse: Germany, Come On Down,” Register, June 15, 2017, https://www.theregister.co.uk/2017/06/15/germany_joins_antiencryption_posse.
25“Attorney General Sessions Delivers Remarks to the Association of State Criminal Investigative Agencies 2018 Spring Conference,” US Department of Justice, May 7, 2018, https://www.justice.gov/opa/speech/attorney-general-sessions-delivers-remarks-association-state-criminal-investigative.
26Zeid stated: “Encryption tools are widely used around the world, including by human rights defenders, civil society, journalists, whistle-blowers and political dissidents facing persecution and harassment. Encryption and anonymity are needed as enablers of both freedom of expression and opinion, and the right to privacy. It is neither fanciful nor an exaggeration to say that, without encryption tools, lives may be endangered. In the worst cases, a Government’s ability to break into its citizens’ phones may lead to the persecution of individuals who are simply exercising their fundamental human rights.”: “Apple-FBI Case Could Have Serious Global Ramifications for Human Rights: Zeid,” UN Human Rights Office of the High Commissioner, March 4, 2016, http://www.ohchr.org/EN/NewsEvents/Pages/DisplayNews.aspx?NewsID=17138.
27“The Historical Background to Media Regulation,” University of Leicester Open Educational Resources, accessed June 10, 2019, https://www.le.ac.uk/oerresources/media/ms7501/mod2unit11/page_02.htm.
28Former UK home secretary Amber Rudd was quite open about this issue in October 2017 when she stated: “I don’t need to understand how encryption works to understand how it’s helping—end-to-end encryption—the criminals.”: Brian Wheeler, “Amber Rudd Accuses Tech Giants of ‘Sneering’ at Politicians,” BBC, October 2, 2017, http://www.bbc.co.uk/news/uk-politics-41463401.
29There are numerous books about the rich and fascinating history of cryptography. Simon Singh, The Code Book (Fourth Estate, 1999), is one of the most accessible. The benchmark history of cryptography remains David Kahn, The Codebreakers (Scribner, 1997); but other titles include Charles River Editors, World War II Cryptography (CreateSpace, 2016); Craig P. Bauer, Unsolved! (Princeton University Press, 2017); Alexander D’Agapeyeff, Codes and Ciphers—A History of Cryptography (Hesperides, 2015); and Stephen Pincock, Codebreaker: The History of Codes and Ciphers (Walker, 2006). Mark Frary, Decipher: The Greatest Codes Ever Invented and How to Break Them (Modern Books, 2017) chronologically surveys a number of historical codes and ciphers. Steven Levy’s superb book Crypto: Secrecy and Privacy in the New Cold War (Penguin, 2000) documents the political events in the US relating to cryptography during the latter decades of the twentieth century.
30There are various books about cryptographic puzzles. Examples include The GCHQ Puzzle Book (GCHQ, 2016); Bud Johnson, Break the Code (Dover, 2013); and Laurence D. Smith, Cryptography: The Science of Secret Writing (Dover, 1998).
Chapter 1: Security in Cyberspace
1Many national mints provide details of currency security features in order to assist with fraud detection. These relate to both the feel of currency and its look. You can learn more about security features of the UK pound coin in “The New 12-Sided £1 Coin,” Royal Mint, accessed June 10, 2019, https://www.royalmint.com/new-pound-coin; of UK banknotes in “Take a Closer Look—Your Easy to Follow Guide to Checking Banknotes,” Bank of England, accessed June 10, 2019, https://www.bankofengland.co.uk/-/media/boe/files/banknotes/take-a-closer-look.pdf; and of US dollar bills in “Dollars in Detail—Your Guide to U.S. Currency,” U.S. Currency Education Program, accessed June 10, 2019, https://www.uscurrency.gov/sites/default/files/downloadable-materials/files/CEP_Dollars_In_Detail_Brochure_0.pdf.
2The UK General Pharmaceutical Council sets standards for pharmacy professionals. Standard 6 is “Pharmacy professionals must behave in a professional manner,” which includes being polite and considerate, showing empathy and compassion, and treating people with respect and safeguarding their dignity: “Standards for Pharmacy Professionals,” General Pharmaceutical Council, May 2017, https://www.pharmacyregulation.org/sites/default/files/standards_for_pharmacy_professionals_may_2017_0.pdf.
3These are not always accurate, since many dangers, such as commercial air accidents, tend to be overestimated in people’s minds, while others, such as air pollution, are severely underestimated.
4Financial fraud across payment cards, remote banking, and checks in the UK totaled £768.8 million in 2016: “Fraud: The Facts, 2017,” Financial Fraud Action UK, 2017, https://www.financialfraudaction.org.uk/fraudfacts17/assets/fraud_the_facts.pdf.
5Stefanie Hoehl et al., “Itsy Bitsy Spider . . . : Infants React with Increased Arousal to Spiders and Snakes,” Frontiers in Psychology 8 (2017): 1710.
6“9/11 Commission Staff Statement No. 16,” 9/11 Commission, June 16, 2004, https://www.9-11commission.gov/staff_statements/staff_statement_16.pdf.
7The kingdom of Ruritania is a fictional country in central Europe that forms the setting for Anthony Hope’s 1894 novel The Prisoner of Zenda. I am taking the liberty of using Ruritania to represent a generic state to avoid treading on any diplomatic toes. This use is (shamelessly) inspired by my colleague Robert Carolina’s adoption of Ruritania during his cyberlaw classes.
8There is an increasing amount of advice for customers on how to detect fraudulent electronic communications. See, for example: “Protecting Yourself,” Get Safe Online, accessed June 10, 2019, https://www.getsafeonline.org/protecting-yourself.
9Software written to gather and utilize information relating to an unsuspecting computer user is often referred to as spyware. This type of software ranges from relatively benign tracking software designed to target advertising at the user based on their computer activity, to monitoring software that reports all activity, including keystrokes, to a third party.
10The general lack of understanding of how cyberspace works creates problems for individuals, but it is perhaps even more chronically an issue for societies. A UK government report highlights the economic costs of a broad lack of digital skills, identifying the need to significantly improve digital skills training in schools, higher education, and on-the-job training: “Digital Skills Crisis,” UK House of Commons Science and Technology Committee, June 7, 2016, https://publications.parliament.uk/pa/cm201617/cmselect/cmsctech/270/270.pdf.
11In 2010, a Dutch website called Please Rob Me caused controversy by combining social media feeds with those of a mobile location app to produce addresses of potentially empty homes. The stated intention was user awareness, but the initiative was condemned by many as irresponsible. Although the Please Rob Me tool no longer exists, since 2010 the number of location-based apps, as well as the capability and effectiveness of combining data sources to determine this type of information, has significantly increased: Jennifer van Grove, “Are We All Asking to Be Robbed?” Mashable, February 17, 2010, https://mashable.com/2010/02/17/pleaserobme.
12As just one example, in 2016 a platform known as Avalanche was taken down by an international consortium of law enforcement agencies. Based in eastern Europe, Avalanche operated a network of compromised computer systems from which a variety of cybercrimes could be conducted, including phishing, spam, ransomware, and denial-of-service attacks. It was estimated that, at its peak, about half a million computers were controlled by Avalanche: Warwick Ashford, “UK Helps Dismantle Avalanche Global Cyber Network,” Computer Weekly, December 2, 2016, http://www.computerweekly.com/news/450404018/UK-helps-dismantle-Avalanche-global-cyber-network.
13The most infamous example of such a network was the one set up by the Ministry for State Security (Stasi) in East Germany between 1950 and 1990. The Stasi engaged over a quarter of a million East German citizens in an espionage network designed to monitor the entire population for signs of dissident activity.
14Just pause, for a moment, to reflect on how much each of your mobile phone, search engine, and social media providers might know about your daily activities from the data you generate when interacting with them. Now imagine how much more they would all know if they shared this information. Less hypothetically, type “employee monitoring” into your favorite search engine (adding slightly to what they already know about you). The results might disturb you.
15Cryptography underpins all forms of financial transactions, including those made using ATMs, debit and credit cards, and the global SWIFT (Society for Worldwide Interbank Financial Telecommunication) network. An annual Financial Cryptography and Data Security conference has run since 1997, dedicated to the theory and practice of using cryptography to protect financial transactions and creating new forms of digital money: International Financial Cryptography Association, accessed June 10, 2019, https://ifca.ai.
Chapter 2: Keys and Algorithms
1Physical letters of introduction are, admittedly, relatively rare these days. However, we still rely heavily on written references for the likes of job applications. Indeed, more intangibly, much of our security in the physical world revolves around what other trusted sources believe about situations. For example, a previously unknown person might be introduced to us by a friend; this is, in some sense, a spoken “letter of introduction.”
2“Open sesame” comes from the story of “Ali Baba and the Forty Thieves,” in One Thousand and One (Arabian) Nights, a compendium of folk tales possibly dating back to the eighth century.
3Note, perhaps confusingly, that the keyboard character “9” is assigned the code 57 in the ASCII table, so it is represented as the binary equivalent of 57, not the binary equivalent of the decimal number 9.
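To see this in Python (a minimal illustration of my own, not part of the ASCII standard itself):

```python
# The character '9' has ASCII code 57, so its binary representation
# differs from that of the number nine.
print(ord('9'))                  # 57
print(format(ord('9'), '08b'))   # 00111001 (binary for 57)
print(format(9, '08b'))          # 00001001 (binary for the number nine)
```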
4Key length is sometimes referred to as key size. I will treat these terms as synonymous.
5There are over 5 billion global mobile phone subscribers: “The Mobile Economy 2019,” GSM Association, 2019, https://www.gsma.com/mobileeconomy.
6This example is based on a lore figure of about 10²² stars in our universe. Star counting is not a precise science, since the number can only be approximated from what we have been able to observe through existing telescopes. Recent estimates place this number at closer to 10²⁴ stars, and many experts suspect this figure is also too low. See, for example, Elizabeth Howell, “How Many Stars Are in the Universe?” Science & Astronomy, May 18, 2017, https://www.space.com/26078-how-many-stars-are-there.html. Counting numbers of cryptographic keys is a much more accurate process!
7The term personal identification number (PIN) tends to be used for a password that is short and consists of numerical digits. The term dates back to the introduction of automatic teller machines (ATMs) in the late 1960s. For our purposes, passwords and PINs are really examples of the same thing—a string of secret characters.
8In fact, this process often does involve cryptography, because most computers do not store copies of your password, but instead store a value computed from your password using a special type of cryptographic function.
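A minimal sketch of this idea, using a salted SHA-256 hash from Python’s standard library (real systems should prefer a dedicated password-hashing function such as the one in note 10; the password here is purely illustrative):

```python
import hashlib, hmac, os

def store_password(password: str) -> tuple[bytes, bytes]:
    # Store a random salt and the hash of salt+password, never the password itself
    salt = os.urandom(16)
    digest = hashlib.sha256(salt + password.encode()).digest()
    return salt, digest

def check_password(password: str, salt: bytes, stored: bytes) -> bool:
    digest = hashlib.sha256(salt + password.encode()).digest()
    return hmac.compare_digest(digest, stored)  # constant-time comparison

salt, stored = store_password("correct horse battery staple")
print(check_password("correct horse battery staple", salt, stored))  # True
print(check_password("wrong guess", salt, stored))                   # False
```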
9When we submit a PIN to an ATM, we are inherently trusting that the ATM will not misuse it. However, there have been many attacks known as ATM skimming, in which criminals modify an ATM in order to capture card and PIN data (the latter can be captured via the overlaying of a fake keypad).
10One method of doing this is to use the function PBKDF2, which is specified in “PKCS #5: Password-Based Cryptography Specification Version 2.1,” Request for Comments: 8018, Internet Engineering Task Force, January 2017, https://tools.ietf.org/html/rfc8018.
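PBKDF2 is available in Python’s standard library; the following sketch derives a 128-bit key from a passphrase (the salt, iteration count, and passphrase are illustrative choices, not recommendations from the RFC):

```python
import hashlib, os

passphrase = b"my memorable passphrase"
salt = os.urandom(16)  # random salt, stored alongside whatever the key protects
key = hashlib.pbkdf2_hmac("sha256", passphrase, salt, iterations=600_000, dklen=16)
print(key.hex())       # a 128-bit key derived from the passphrase
```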
11The Oxford English Dictionary, Oxford Dictionaries, accessed June 10, 2019, https://languages.oup.com/oed.
12An excellent introduction is Deborah J. Bennett, Randomness (Harvard University Press, 1998).
13Indeed, one of the most common methods of randomly generating keys for use in a cryptographic algorithm is to generate them using a (different) cryptographic algorithm.
14A typical formal requirement for a good cryptographic algorithm is that it be impossible to tell the difference between outputs of the cryptographic algorithm and those of a random number generator.
15Security products based on home-cooked cryptographic algorithms fall into the category that some cryptographers call snake oil: Bruce Schneier, “Snake Oil,” Crypto-Gram, February 15, 1999, https://www.schneier.com/crypto-gram/archives/1999/0215.html#snakeoil.
16This isn’t strictly true for applications not in the public domain. It would be perfectly reasonable for a government agency to choose to design a secret algorithm for its own internal use, as long as that agency had access to sufficient cryptographic expertise.
17For supporting public technologies, there has been a noticeable shift from secret to open cryptographic algorithm design over the last few decades, assisted by the development of open cryptographic standards.
18There are plenty of examples of secret algorithms in public technologies that have been successfully reverse engineered. The encryption algorithm A5/1 used in the GSM (Global System for Mobile Communications) standard is one such case.
19Auguste Kerckhoffs, “La cryptographie militaire,” Journal des sciences militaires 9 (January 1883): 5–83; and (February 1883): 161–91. English translation of the principles can be found in Fabien Petitcolas, “Kerckhoffs’ Principles from ‘La cryptographie militaire,’” Information Hiding Homepage, accessed June 10, 2019, http://petitcolas.net/kerckhoffs.
20The encryption algorithms used to protect telecommunications were secret in the GSM standard of the 1990s, but in more recent iterations, such as the 2008 LTE (Long-Term Evolution) standard, these are publicly specified.
21Merchandise 7X appears to remain a secret, despite claims otherwise over the years: William Poundstone, Big Secrets (William Morrow, 1985).
Chapter 3: Keeping Secrets
1Everyone needs confidentiality because everyone has something to hide. There is an oft-repeated mantra that if you have nothing to hide, then you should not worry about government surveillance programs. The fallacy of this argument is explored in detail in Daniel J. Solove, Nothing to Hide (Yale University Press, 2011); and David Lyon, Surveillance Studies: An Overview (Polity Press, 2007).
2Eric Hughes, “A Cypherpunk’s Manifesto,” March 9, 1993, https://www.activism.net/cypherpunk/manifesto.html.
3I use the phrase “should not fully trust” to instill a degree of caution about trusting devices and networks rather than to induce paranoia about rampant insecurity. The bottom line is that we can never be sure that our devices and networks have not been compromised in some way (such as through installation of malware), and it is thus prudent to be wary about trusting them completely.
4This argument is certainly contestable. The metadata relating to calling patterns can be very useful to investigators who do not have access to call content. Indeed, the utility of such metadata was made apparent through one of Edward Snowden’s revelations concerning the NSA’s collection of metadata from the US telecom provider Verizon: Glenn Greenwald, “NSA Collecting Phone Records of Millions of Verizon Customers Daily,” Guardian, June 6, 2013, https://www.theguardian.com/world/2013/jun/06/nsa-phone-records-verizon-court-order.
5A good introduction to steganography is Peter Wayner, Disappearing Cryptography: Information Hiding: Steganography & Watermarking (MK/Morgan Kaufmann, 2009).
6For some real examples of steganography being used as a tool for attacking computers, see Ben Rossi, “How Cyber Criminals Are Using Hidden Messages in Image Files to Infect Your Computer,” Information Age, July 27, 2015, http://www.information-age.com/how-cyber-criminals-are-using-hidden-messages-image-files-infect-your-computer-123459881.
7Although this type of application is often discussed, and advice for how to deploy it can easily be found on the internet (for example, Krintoxi, “Using Steganography and Cryptography to Bypass Censorship in Third World Countries,” Cybrary, September 5, 2015, https://www.cybrary.it/0p3n/steganography-and-cryptography-to-bypass-censorship-in-third-world-countries), there is not much evidence that it is widely deployed. The reasons are probably similar to those outlined by critics of post-9/11 claims that steganography was heavily used by terrorists: Robert J. Bagnall, “Reversing the Steganography Myth in Terrorist Operations: The Asymmetrical Threat of Simple Intelligence Dissemination Techniques Using Common Tools,” SANS Institute, 2002, https://www.sans.org/reading-room/whitepapers/stenganography/reversing-steganography-myth-terrorist-operations-asymmetrical-threat-simple-intellig-556. In fact, since Bagnall’s 2002 comments, and particularly since Edward Snowden’s 2013 revelations, the range of methods of secure communications available to anyone wishing to avoid government surveillance has expanded.
8The Atbash cipher is an ancient Hebrew means of scrambling letters (indeed, the name Atbash derives from the first and last pairs of letters of the Hebrew alphabet). Some commentators believe that the biblical book of Jeremiah deploys the Atbash in several places: Paul Y. Hoskisson, “Jeremiah’s Game,” Insight 30, no. 1 (2010): 3–4, https://publications.mi.byu.edu/publications/insights/30/1/S00001-30-1.pdf.
9For a brief history and specification of Morse code, see Encyclopaedia Britannica, s.v. “Morse Code,” accessed July 21, 2019, https://www.britannica.com/topic/Morse-Code.
10For the history of deciphering Egyptian hieroglyphs, see Andrew Robinson, Cracking the Egyptian Code: The Revolutionary Life of Jean-François Champollion (Thames and Hudson, 2012).
11Dan Brown, The Da Vinci Code (Doubleday, 2003).
12Most modern uses of encryption are also accompanied by a separate cryptographic check that enables the receiver to detect whether the ciphertext has been modified in any way. Increasingly, these two processes are being combined through the use of special authenticated-encryption algorithms that provide both cryptographic services.
13My cryptographic colleague Steven Galbraith completely disagrees. He argues that Turing was sufficiently smart that if someone had suggested the idea of asymmetric encryption to him, Turing would have probably responded with: “Yes, of course!”
14Named after Blaise de Vigenère, this encryption algorithm was invented by Giovan Battista Bellaso in 1553. Widely believed at the time to be “unbreakable,” the Vigenère cipher is relatively easily decrypted, once the length of the key is determined—a process that can be conducted by statistical analysis of the ciphertext. A good explanation of both the algorithm and how to break it can be found in Simon Singh, The Code Book (Fourth Estate, 1999).
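For the curious, here is a minimal Python sketch of Vigenère encryption on uppercase letters (illustrative only; it simply shifts each letter by the corresponding letter of the repeating key):

```python
def vigenere_encrypt(plaintext: str, key: str) -> str:
    out = []
    for i, ch in enumerate(plaintext.upper()):
        # Shift each letter by the corresponding (repeating) key letter
        shift = ord(key[i % len(key)].upper()) - ord('A')
        out.append(chr((ord(ch) - ord('A') + shift) % 26 + ord('A')))
    return "".join(out)

print(vigenere_encrypt("ATTACKATDAWN", "LEMON"))  # LXFOPVEFRNHR
```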
15For more details about the history and breaking of Enigma machines, see, for example, Hugh Sebag-Montefiore, Enigma: The Battle for the Code (Weidenfeld & Nicolson, 2004).
16“Data Encryption Standard (DES),” Federal Information Processing Standards, FIPS Publication 46, January 1977. This standard was subsequently revised several times and ultimately withdrawn in 2005. The last revised version, FIPS Publication 46-3, is archived at https://csrc.nist.gov/csrc/media/publications/fips/46/3/archive/1999-10-25/documents/fips46-3.pdf.
17Triple DES encryption essentially uses DES to encrypt the data with one key, decrypt it with a second key, and then encrypt the result with a third key (Triple DES decryption is the reverse process). Initially intended as a quick fix for DES, Triple DES is still used by many applications, particularly in the financial sector. Details and recommendations for deployment of Triple DES can be found in Elaine Barker and Nicky Mouha, “Recommendation for the Triple Data Encryption Standard (TDEA) Block Cipher,” National Institute of Standards and Technology, NIST Special Publication 800-67, rev. 2, November 2017, https://doi.org/10.6028/NIST.SP.800-67r2.
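A sketch of this encrypt-decrypt-encrypt (EDE) construction on a single block, assuming the third-party pycryptodome package provides the single-DES primitive (ECB mode is used purely to show the composition, not as a recommended way to encrypt data):

```python
from Crypto.Cipher import DES  # assumes the pycryptodome package is installed

def triple_des_encrypt_block(block: bytes, k1: bytes, k2: bytes, k3: bytes) -> bytes:
    # EDE: encrypt with k1, decrypt with k2, encrypt with k3
    step1 = DES.new(k1, DES.MODE_ECB).encrypt(block)
    step2 = DES.new(k2, DES.MODE_ECB).decrypt(step1)
    return DES.new(k3, DES.MODE_ECB).encrypt(step2)

def triple_des_decrypt_block(block: bytes, k1: bytes, k2: bytes, k3: bytes) -> bytes:
    # The reverse process: decrypt with k3, encrypt with k2, decrypt with k1
    step1 = DES.new(k3, DES.MODE_ECB).decrypt(block)
    step2 = DES.new(k2, DES.MODE_ECB).encrypt(step1)
    return DES.new(k1, DES.MODE_ECB).decrypt(step2)

k1, k2, k3 = b"8bytekey", b"8byteke2", b"8byteke3"  # DES keys are 8 bytes each
ciphertext = triple_des_encrypt_block(b"ABCDEFGH", k1, k2, k3)
print(triple_des_decrypt_block(ciphertext, k1, k2, k3))  # b'ABCDEFGH'
```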
18“Specification for the Advanced Encryption Standard (AES),” Federal Information Processing Standards, FIPS Publication 197, November 26, 2001, https://nvlpubs.nist.gov/nistpubs/fips/nist.fips.197.pdf.
19A historical overview of the AES process, including relevant documentation, can be found in “AES Development,” NIST Computer Security Resource Center, updated October 10, 2018, https://csrc.nist.gov/projects/cryptographic-standards-and-guidelines/archived-crypto-projects/aes-development.
20All the AES operations are conducted on a square of bytes (a four-by-four grid holding the sixteen bytes of a 128-bit block)—it is no coincidence that the original encryption algorithm from which the AES was developed was called Square.
21The AES design process lasted almost four years, with fifteen candidate designs eventually whittled down to one, following an intense evaluation process that included three dedicated conferences. The precise details behind the design of AES are documented in Joan Daemen and Vincent Rijmen, The Design of Rijndael (Springer, 2002).
22The block ciphers indirectly referred to here include BEAR, Blowfish, Cobra, Crab, FROG, Grand Cru, LION, LOKI, Red Pike, Serpent, SHARK, Skipjack, Twofish, and Threefish.
23NIST provides a list of some recommended modes of operation, including those for confidentiality only (CBC, CFB, ECB, OFB), authentication only (CMAC), authenticated encryption (CCM, GCM), disk encryption (XTS), and protection of cryptographic keys (KW, KWP): “Block Cipher Techniques—Current Modes,” NIST Computer Security Resource Center, updated May 17, 2019, https://csrc.nist.gov/Projects/Block-Cipher-Techniques/BCM/Current-Modes.
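As an example of one of the confidentiality-only modes, the following sketch encrypts a single block of data with AES in CBC mode, assuming the third-party pyca/cryptography package (real applications would also add padding and an integrity check):

```python
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

key = os.urandom(16)             # 128-bit AES key
iv = os.urandom(16)              # fresh initialization vector for each message
plaintext = b"sixteen byte msg"  # CBC works on whole 16-byte blocks

encryptor = Cipher(algorithms.AES(key), modes.CBC(iv)).encryptor()
ciphertext = encryptor.update(plaintext) + encryptor.finalize()

decryptor = Cipher(algorithms.AES(key), modes.CBC(iv)).decryptor()
print(decryptor.update(ciphertext) + decryptor.finalize())  # b'sixteen byte msg'
```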
24It’s not quite the chicken-or-the-egg dilemma, because using encryption is not the only conceivable means by which a secret key can be distributed to someone. It’s just the most obvious means, and one that is often used in practice.
25The security of Wi-Fi networks has a somewhat checkered history. The main security standards are covered in the IEEE 802.11 series. These standards effectively restrict access to authorized devices and enable communications on a Wi-Fi network to be encrypted. Other related standards, such as WPS (Wi-Fi Protected Setup), are designed to make it easier to initialize keys on a Wi-Fi network.
Chapter 4: Sharing Secrets with Strangers
1“Total Number of Websites,” Internet Live Stats, accessed June 10, 2019, http://www.internetlivestats.com/total-number-of-websites.
2The trusted-center scenario for key distribution can work very well in environments that are centralized and have obvious trust points. For example, the network authentication system Kerberos works this way: “Kerberos: The Network Authentication Protocol,” MIT Kerberos, updated January 9, 2019, https://web.mit.edu/kerberos.
3There are many good online videos of the subsequent process for using padlocks to exchange a secret—for example: Chris Bishop, “Key Exchange,” YouTube, June 9, 2009, https://www.youtube.com/watch?v=U62S8SchxX4.
4A function suitable for asymmetric encryption is sometimes called a trapdoor one-way function. “One-way” refers to the fact that it must be easy to compute but hard to reverse, while “trapdoor” indicates that there must be a way for the genuine recipient to reverse the process (knowledge of the private decryption key being the trapdoor).
5Computational complexity theory is concerned with classifying computational problems according to their difficulty. A good textbook focusing on the relationship between computational complexity and cryptography is John Talbot and Dominic Welsh, Complexity and Cryptography: An Introduction (Cambridge University Press, 2006).
6The history of the study of primes and why they are so significant both to mathematics and to other fields of study is discussed in Marcus du Sautoy, The Music of the Primes: Why an Unsolved Problem in Mathematics Matters (HarperPerennial, 2004).
7Ron Rivest, Adi Shamir, and Len Adleman, “A Method for Obtaining Digital Signatures and Public-Key Cryptosystems,” Communications of the ACM 21, no. 2 (1978): 120–26.
8A list of the world’s fastest supercomputers is periodically updated at “TOP500 Lists,” TOP500.org, accessed July 21, 2019, https://www.top500.org/lists/top500.
923,189 is the 2,587th prime. If you are unconvinced and don’t want to check this, I recommend you visit Andrew Booker, “The Nth Prime Page,” accessed June 10, 2019, https://primes.utm.edu/nthprime.
10The NIST recommendations for data that needs protection up until the year 2030 suggest using a product of two primes that is more than 3,000 bits long (see “Recommendation for Key Management,” National Institute of Standards and Technology, NIST Special Publication 800-57, Part 1, rev. 4, 2016). Such a number is more than 900 decimal digits long, and RSA uses primes of roughly equal size, making each of the two primes more than 450 decimal digits long.
11The mathematical knowledge required to appreciate how RSA works is a basic understanding of modular arithmetic and the Fermat-Euler theorem, both of which should be familiar to anyone who has studied an introduction to number theory. Many introductory cryptography textbooks also explain the minimum mathematics required, including Keith M. Martin, Everyday Cryptography, 2nd ed. (Oxford University Press, 2017).
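For readers who want to see the mechanics, here is a toy RSA computation with absurdly small primes (a sketch for intuition only; real RSA uses primes hundreds of digits long and adds padding):

```python
p, q = 61, 53                      # toy primes
n = p * q                          # 3233, the public modulus
phi = (p - 1) * (q - 1)            # 3120
e = 17                             # public encryption exponent
d = pow(e, -1, phi)                # 2753, the private decryption exponent

message = 65
ciphertext = pow(message, e, n)    # encrypt: message^e mod n
recovered = pow(ciphertext, d, n)  # decrypt: ciphertext^d mod n
print(ciphertext, recovered)       # 2790 65
```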
12This remark is a combination of fact and slightly facetious speculation. The fact relates to the time required to factor a number of this size on conventional computers. It is believed that this factoring would take approximately the same time as a search for a 128-bit key, which is something I later argue would need about 50 million billion years. The speculation is, of course, that Homo sapiens may not stick around for that length of time. It is believed that Homo sapiens has existed for about 300,000 years, so projecting this far into the future is guesswork. A range of possible futures for our species is discussed in Jolene Creighton, “How Long Will [It] Take Humans to Evolve? What Will We Evolve Into?” Futurism, December 12, 2013, https://futurism.com/how-long-will-take-humans-to-evolve-what-will-we-evolve-into.
13A table that sorts block ciphers into categories relating to their frequency of use (common, less common, and other) can be found at the bottom of Wikipedia, s.v. “Block Cipher,” accessed June 10, 2019, https://en.wikipedia.org/wiki/Block_cipher.
14The threat presented by quantum computers, which I discuss later, has spurred a major international effort to find new asymmetric encryption algorithms based on new hard problems: “Post-quantum Cryptography,” NIST Computer Security Resource Center, updated June 3, 2019, https://csrc.nist.gov/Projects/Post-Quantum-Cryptography. However, even this process involves only a handful of fundamentally different problems around which candidate algorithms are based.
15The history of the development of asymmetric (public-key) encryption is fascinating. The earliest discovery is now attributed to researchers at GCHQ, who were seeking a solution to the problem of distributing secret keys around a network. The conceptual idea behind asymmetric encryption was set out by James Ellis in a 1969 document, although not instantiated until Clifford Cocks proposed a real scheme in 1973. Only in 1997, however, did these discoveries come to public light. In the meantime, a similar process occurred in the public space, with Whitfield Diffie and Martin Hellman conceptualizing the idea in 1976 and with a number of researchers later proposing instantiations, including the RSA algorithm formulated by Rivest, Shamir, and Adleman in 1977. For more information, see James Ellis, “The History of Non-secret Encryption,” Cryptologia 23, no. 3 (1999): 267–73; Whitfield Diffie and Martin Hellman, “New Directions in Cryptography,” IEEE Transactions on Information Theory 22, no. 6 (1976): 644–54; and Steven Levy, Crypto: Secrecy and Privacy in the New Cold War (Penguin, 2000).
16While factoring and finding discrete logarithms over modular numbers are broadly believed to be equally difficult, the advantage of basing an asymmetric encryption algorithm on elliptic curves is that finding discrete logarithms over elliptic curves is believed to be substantially more difficult, which allows the keys used for elliptic-curve-based encryption to be shorter than those for RSA. The mathematics behind elliptic curves is straightforward for those with a numerical background, but it is otherwise not for the fainthearted. Most mathematical introductions to cryptography explain all you need to know; for example, see Douglas R. Stinson and Maura B. Paterson, Cryptography: Theory and Practice, 4th ed. (CRC Press, 2018).
17A well-documented example of this problem concerns an attack in which 300,000 Iranian citizens believed they were communicating, by computer, with Google Gmail servers, when in fact they had been presented with alternative public keys that connected them to an attack site, which was then used to monitor their communications. This attack happened because a company called DigiNotar, which issued certified public keys, was itself hacked in order to create the public keys that fooled the Iranian Gmail users. See, for example, Gregg Keizer, “Hackers Spied on 300,000 Iranians Using Fake Google Certificate,” Computerworld, September 6, 2011, https://www.computerworld.com/article/2510951/cybercrime-hacking/hackers-spied-on-300-000-iranians-using-fake-google-certificate.html.
18Laurie Lee’s iconic novel Cider with Rosie (Hogarth Press, 1959) is based on his childhood in the 1920s, describing life in a small English village before the arrival of transformational technology such as the motor car. It represents an apparently lost rural idyll, free from the pressures of time and connectivity to the outside world.
19Examples of important internet standards that all use hybrid encryption include Transport Layer Security (TLS) for secure web connections, Internet Protocol Security (IPSec) for establishing virtual private networks to enable activities such as working from home, Secure Shell (SSH) for secure file transfer, and Secure Multipurpose Internet Mail Extensions (S/MIME) for secure email.
20The research and advisory company Gartner is associated with a simple methodology known as the Gartner Hype Cycle for tracking expectations concerning new technologies (see “Gartner Hype Cycle,” Gartner, accessed June 10, 2019, https://www.gartner.com/en/research/methodologies/gartner-hype-cycle). This cycle is characterized by an early peak of exaggerated and often poorly informed interest, then a rapid decline as the realities of implementation kick in, followed by a gentle rise as the true niches for the technology become understood. Asymmetric encryption has probably now journeyed to the “plateau of productivity,” where its advantages and disadvantages are understood well enough that it is deployed appropriately.
Chapter 5: Digital Canaries
1Studies in cognitive psychology indicate that most people prefer to avoid losses rather than to acquire gains of an equivalent amount. This loss aversion is one of a number of cognitive biases brought to popular attention in Daniel Kahneman, Thinking, Fast and Slow (Penguin, 2012).
2Note that data integrity is only about detecting errors, not correcting them. Separate mathematical techniques known as error-correcting codes enable a degree of automatic correction of errors. These are not normally regarded as security techniques, and they are used in applications where errors are expected but we don’t want to be made aware of their existence, such as when we’re listening to digital music.
3Until the mid-1980s, miners in the UK and other countries deployed caged canaries to detect the presence of toxic gases—a practice ultimately replaced by digital detectors: Kat Eschner, “The Story of the Real Canary in the Coal Mine,” Smithsonian, December 30, 2016, https://www.smithsonianmag.com/smart-news/story-real-canary-coal-mine-180961570.
4The term fake news is often associated with Donald Trump, who used the term to describe negative press coverage during his run for the presidency. However, the intentional spreading of information (accurate or otherwise) is an ancient craft. What is relevant to our discussion is that digital media make it easier and faster to distribute such information.
5There is evidence that people have more trouble identifying fake news when it is spread via digital media: Simeon Yates, “‘Fake News’—Why People Believe It and What Can Be Done to Counter It,” Conversation, December 13, 2016, https://theconversation.com/fake-news-why-people-believe-it-and-what-can-be-done-to-counter-it-70013.
6“Integrity,” Lexico, accessed June 12, 2019, https://www.lexico.com/en/definition/integrity.
7Notably, these trust links fade quickly. If you trust your friend Charlie, who in turn trusts his friend Diane, then to what extent do you trust Diane? Maybe you will trust her for some specific things, but you are unlikely to trust many of Diane’s other friends; the links quickly become somewhat tenuous. This fading of trust as links become more and more distant has potential ramifications in cyberspace, where, for example, enthusiastic users of social media can rapidly assemble legions of alleged “friends.”
8MD5 is a cryptographic hash function invented by Ronald Rivest (the “R” of RSA) in 1991. The value it outputs is 128 bits long. MD5 is fully specified in “The MD5 Message-Digest Algorithm,” Request for Comments: 1321, Internet Engineering Task Force, April 1992, https://tools.ietf.org/html/rfc1321. Note that serious weaknesses have subsequently been discovered in MD5; see “Updated Security Considerations for the MD5 Message-Digest and the HMAC-MD5 Algorithms,” Request for Comments: 6151, Internet Engineering Task Force, March 2011, https://tools.ietf.org/html/rfc6151.
9The use of seals for this purpose is as old as civilization itself, with ancient stone seals used to make impressions in clay being a significant archaeological artifact for historians: Marta Ameri et al., Seals and Sealing in the Ancient World (Cambridge University Press, 2018).
10First developed in 1970, a modern ISBN consists of thirteen digits, including identifiers for the country of origin and publisher of the book. You can submit an ISBN to obtain full details of the book at ISBN Search, accessed June 10, 2019, https://isbnsearch.org.
11The majority of these examples use an algorithm named after Hans Peter Luhn, who patented it in 1960. The Luhn algorithm for computing a check digit is similar to, but different from, that used for the ISBN: Wikipedia, s.v. “Luhn Algorithm,” accessed June 10, 2019, https://en.wikipedia.org/wiki/Luhn_algorithm.
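A minimal Python sketch of the Luhn check as applied to a card number supplied as a string of digits (the test number below is a standard published example, not a real card):

```python
def luhn_valid(number: str) -> bool:
    # Double every second digit from the right; products over 9 have 9 subtracted
    total = 0
    for i, ch in enumerate(reversed(number)):
        d = int(ch)
        if i % 2 == 1:
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

print(luhn_valid("79927398713"))  # True  (standard test number)
print(luhn_valid("79927398714"))  # False (last digit altered)
```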
12Confusingly, the term hash function is used in the field of computer science for several different purposes. I will restrict my use of the term to what are sometimes known as cryptographic hash functions.
13I previously mentioned the hash function MD5, which is often used for integrity checking of downloaded files. Other examples of practically deployed hash functions include SHA-1, the SHA-2 family, and the SHA-3 family, the latter of which was selected in 2015 as the winner of an international competition held by the US National Institute of Standards and Technology: “Hash Functions,” NIST Computer Security Resource Center, updated May 3, 2019, https://csrc.nist.gov/Projects/Hash-Functions.
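As an illustration, an integrity check on a downloaded file amounts to recomputing its hash and comparing the result with a published value (a minimal sketch using SHA-256 from Python’s standard library; the file name and expected digest are hypothetical):

```python
import hashlib

def file_sha256(path: str) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):  # hash the file in chunks
            h.update(chunk)
    return h.hexdigest()

expected = "paste-the-published-digest-here"  # hypothetical published value
print(file_sha256("download.iso") == expected)
```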
14The problem is not the basic idea, but the way that many hash functions are designed. To put it crudely, it is common for a hash function to input some of the data, compress it, input a bit more, compress it, and so on. Therefore, appending the key to the end of the data means that the key will not be mixed in with the data as well as it could be.
15“HMAC: Keyed-Hashing for Message Authentication,” Request for Comments: 2104, Internet Engineering Task Force, February 1997, https://tools.ietf.org/html/rfc2104.
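HMAC is part of Python’s standard library; a minimal sketch of computing and checking a MAC on a message (the key and message are illustrative):

```python
import hashlib, hmac

key = b"a shared secret key"
message = b"Pay Alice 100 pounds"

tag = hmac.new(key, message, hashlib.sha256).hexdigest()

# The receiver recomputes the tag with the shared key and compares in constant time
check = hmac.new(key, message, hashlib.sha256).hexdigest()
print(hmac.compare_digest(tag, check))  # True; any change to the message breaks it
```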
16“The AES-CMAC Algorithm,” Request for Comments: 4493, Internet Engineering Task Force, June 2006, https://tools.ietf.org/html/rfc4493.
17There are many different reasons why authenticated encryption modes, which combine encryption and MAC computation, offer advantages over encrypting and adding a MAC separately. Some of these relate to efficiency, but the most compelling apply to security. In essence, certain things can go wrong with their integration when the two operations are conducted separately, and these problems can be avoided if an approved authenticated encryption mode is used. Examples of authenticated encryption modes include CCM (“Recommendation for Block Cipher Modes of Operation: The CCM Mode for Authentication and Confidentiality,” NIST Special Publication 800-38C, July 20, 2007) and GCM (“Recommendation for Block Cipher Modes of Operation: Galois/Counter Mode [GCM] and GMAC,” NIST Special Publication 800-38D, November 2007).
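A sketch of authenticated encryption with AES-GCM, assuming the third-party pyca/cryptography package (the messages are illustrative; decryption raises an exception if either the ciphertext or the associated data has been tampered with):

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=128)
nonce = os.urandom(12)            # must never be reused with the same key
aesgcm = AESGCM(key)

ciphertext = aesgcm.encrypt(nonce, b"meet at noon", b"header data")
plaintext = aesgcm.decrypt(nonce, ciphertext, b"header data")
print(plaintext)                  # b'meet at noon'
```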
18This argument assumes no auxiliary evidence, such as a secure network log entry demonstrably proving that the MAC was sent over a network whose origin was the sender’s internet address.
19Digital signatures would also be insecure. If you wanted to digitally sign a very long document, then it would need to be broken up into independent chunks of data, each signed separately. An attacker could intercept this string of separate chunks of data and their accompanying signatures and swap chunks around, along with their signatures. The result would appear to be a valid set of data chunks and signatures. In reality, however, the combined message would be incorrectly ordered.
20In many ways, the longevity and ubiquity of handwritten signatures are surprising, and testament to their convenience. Even as we move toward increased use of digital documents, the handwritten signature seems to prevail through the widespread acceptance of digital scans of handwritten signatures. The digital scan of a handwritten signature is just a small image file that is easily extracted from a document; it is, in this regard, an even weaker mechanism than a handwritten signature.
21Reporters Without Borders produces the World Press Freedom Index, which bases its results on analyses of media independence, self-censorship, legislation, transparency, and quality of media infrastructure. North Korea, which regards listening to or viewing media content that originates outside of the country as a criminal offense, is consistently close to the bottom of the table: “North Korea,” Reporters Without Borders, accessed June 10, 2019, https://rsf.org/en/north-korea.
22This idea was brought to popular attention in Eli Pariser, The Filter Bubble (Penguin, 2012).
23In my experience, the quality of information on Wikipedia concerning cryptography is pretty good. This reliability is probably indicative of both a strong interest in cryptography across the internet community, and perhaps a high correlation between people interested in cryptography and people with the will (and/or capability) to edit Wikipedia pages.
24Money being moved elsewhere is, of course, precisely what happens when confidence is lost in a bank, such as during the 2007 collapse of the UK bank Northern Rock: Dominic O’Connell, “The Collapse of Northern Rock: Ten Years On,” BBC, September 12, 2017, https://www.bbc.co.uk/news/business-41229513.
25A plethora of information about Bitcoin is available. An excellent background on the need for (and use of) Bitcoin is Dominic Frisby, Bitcoin: The Future of Money? (Unbound, 2015). A readable introduction to the cryptography used in Bitcoin is Andreas M. Antonopoulos, Mastering Bitcoin: Unlocking Digital Cryptocurrencies (O’Reilly, 2014).
26There have been many examples of attempts to facilitate certain aspects of digital cash through the centralized banking system. These include 1990s digital-wallet technologies such as Mondex and Proton, and more recently the likes of Apple Pay. While these all offer some of the convenience of cash, they remain linked to traditional bank accounts.
27Banks pioneered the commercial use of cryptography in the 1970s. The motivation for and success of the Data Encryption Standard (DES) were due largely to the need for digital security in the financial sector.
28One of the many clever features of Bitcoin is that it has a parameter that can be adjusted to control the frequency of block creation.
29Somewhat against the decentralization spirit behind Bitcoin, the profitability of bitcoin mining has led to the development of enormous processing centers dedicated solely to mining bitcoin. These are sometimes referred to as bitcoin farms: Julia Magas, “Top Five Biggest Crypto Mining Areas: Which Farms Are Pushing Forward the New Gold Rush?” Cointelegraph, June 23, 2018, https://cointelegraph.com/news/top-five-biggest-crypto-mining-areas-which-farms-are-pushing-forward-the-new-gold-rush.
30This is often referred to as a fork in the blockchain.
31For a full list of current cryptocurrencies, see “Cryptocurrency List,” CoinLore, accessed June 10, 2019, https://www.coinlore.com/all_coins.
Chapter 6: Who’s Out There?
1See, for example, Michael Cavna, “‘Nobody Knows You’re a Dog’: As Iconic Internet Cartoon Turns 20, Creator Peter Steiner Knows the Joke Rings as Relevant as Ever,” Washington Post, July 31, 2013.
2Facebook reported to the US Securities and Exchange Commission that it made $20.21 from each of its 1.4 billion users in 2017: Julia Glum, “This Is Exactly How Much Your Personal Information Is Worth to Facebook,” Money, March 21, 2018, http://money.com/money/5207924/how-much-facebook-makes-off-you.
3For a discussion about common threats to passports and the security techniques used to counter them, see “Passport Security Features: 2019 Report Anatomy of a Secure Travel Document,” Gemalto, updated May 20, 2019, https://www.gemalto.com/govt/travel/passport-security-design.
4This is why we use a variety of security mechanisms with our mobile phones. The mobile phone company uses security mechanisms on the SIM card to identify the account holder. The phone owner typically uses a PIN or password to control who can use the phone.
5While it is technically possible to install software on a phone that could conduct bank fraud, more common attacks on mobile banking involve criminals stealing phone numbers or linking different mobile phone accounts to a target’s bank account. See, for example, Miles Brignall, “Mobile Banking in the Spotlight as Fraudsters Pull £6,000 Sting,” Guardian, April 2, 2016, https://www.theguardian.com/money/2016/apr/02/mobile-banking-fraud-o2-nationwide; and Anna Tims, “‘Sim Swap’ Gives Fraudsters Access-All-Areas via Your Mobile Phone,” Guardian, September 26, 2015, https://www.theguardian.com/money/2015/sep/26/sim-swap-fraud-mobile-phone-vodafone-customer.
6Alan Turing introduced the famous Turing test, which is designed to distinguish the behavior of a computer from that of a human: Alan M. Turing, “Computing Machinery and Intelligence,” Mind 59, no. 236 (October 1950): 433–60.
7This type of malware is often referred to as a keylogger. A good introduction to the topic is Nikolay Grebennikov, “Keyloggers: How They Work and How to Detect Them,” SecureList, March 29, 2007, https://securelist.com/keyloggers-how-they-work-and-how-to-detect-them-part-1/36138.
8Captchas are fairly unpopular because they waste time and are easy to get wrong, resulting in further delays. A discussion of some alternative approaches can be found in Matt Burgess, “Captcha Is Dying. This Is How It’s Being Reinvented for the AI Age,” Wired, October 26, 2017, https://www.wired.co.uk/article/captcha-automation-broken-history-fix.
9A good introduction to biometrics is John R. Vacca, Biometric Technologies and Verification Systems (Butterworth-Heinemann, 2007).
10A famous example of biometrics being “stolen” is the case of so-called gummy fingers, which are artificial fingers designed to fool fingerprint recognition systems: Tsutomu Matsumoto et al., “Impact of Artificial ‘Gummy’ Fingers on Fingerprint Systems,” Proceedings of SPIE 4677 (2002), https://cryptome.org/gummy.htm.
11Banks could be more thorough—for example, if every personal device had a card reader that could detect the physical card rather than just the embossed card data. Like all security measures, however, this is an issue of striking a balance between security, cost, and usability.
12Levels of card-not-present fraud around the world are reviewed in “Card-Not-Present Fraud around the World,” US Payments Forum, March 2017, https://www.uspaymentsforum.org/wp-content/uploads/2017/03/CNP-Fraud-Around-the-World-WP-FINAL-Mar-2017.pdf. For example, card-not-present fraud accounted for 69 percent of card fraud in the UK in 2014, and 76 percent of card fraud in Canada in 2015.
13Authentication and authorization are related concepts and often confused. Authentication is primarily about establishing who is out there. Authorization concerns what someone is permitted to do. When you log on to your social media account, you are authenticating. The social media platform then uses the process of authorization to determine which data you are allowed to view. Although authorization often follows authentication, it does not necessarily require it. A supermarket assistant authorizes the sale of alcohol by determining the age of a shopper—either by looking at the person or by demanding age-verifying evidence—without requiring knowledge of who they are. Cryptography provides tools for authentication. While cryptography can be used to support it, authorization is commonly managed by other means (for example, through rules governing access to entries in a database).
14This technique is no longer reliable, given the powerful digital editing software that is freely available.
15Elizabeth Stobert, “The Agony of Passwords,” in CHI ’14 Extended Abstracts on Human Factors in Computing Systems (ACM, 2014), 975–80.
16Advice on managing passwords is often contradictory because difficult trade-offs must be made. For example, regular password change reduces the impact of a compromise, but it also complicates life for password users and may push them toward unsafe practices, such as writing passwords down. For general guidance on password management, see, for example, “Password Administration for System Owners,” National Cyber Security Centre, November 19, 2018, https://www.ncsc.gov.uk/collection/passwords.
17Perhaps even worse than an individual breach, the results of such attacks appear to be aggregating in enormous repositories of stolen passwords and accompanying credentials: Mohit Kumar, “Collection of 1.4 Billion Plain-Text Leaked Passwords Found Circulating Online,” Hacker News, December 12, 2017, https://thehackernews.com/2017/12/data-breach-password-list.html.
18In 2019, Facebook acknowledged that a bug in its password management systems had resulted in hundreds of millions of user passwords being stored unencrypted on an internal platform: Lily Hay Newman, “Facebook Stored Millions of Passwords in Plaintext—Change Yours Now,” Wired, March 21, 2019, https://www.wired.com/story/facebook-passwords-plaintext-change-yours.
19There are many sources of advice about how to select strong passwords. One example is the guidelines from the National Institute of Standards and Technology, which are summarized in Mike Garcia, “Easy Ways to Build a Better P@$5w0rd,” NIST, Taking Measure (blog), October 4, 2017, https://www.nist.gov/blogs/taking-measure/easy-ways-build-better-p5w0rd.
20This attitude is not new. A colleague of mine was informed by a systems engineer in the 1980s that “cryptography is nothing more than an expensive way of degrading performance.”
21Examples of key-stretching algorithms include PBKDF2 and Argon2.
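As a rough illustration of how key stretching slows down password guessing, here is a minimal sketch using PBKDF2 from Python's standard library; the password, salt handling, and iteration count are illustrative choices for the example, not recommendations.

    import hashlib, os

    password = b"correct horse battery staple"  # example password only
    salt = os.urandom(16)                        # random salt, stored alongside the derived key
    # Repeatedly applying HMAC-SHA-256 makes each password guess costly to test,
    # which is the whole point of key stretching.
    key = hashlib.pbkdf2_hmac("sha256", password, salt, 200_000, dklen=32)
    print(key.hex())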
22For a UK government perspective on the value of password managers, see Emma W., “What Does the NCSC Think of Password Managers?” National Cyber Security Centre, January 24, 2017, https://www.ncsc.gov.uk/blog-post/what-does-ncsc-think-password-managers.
23The “2016 Data Breach Investigations Report” by Verizon claimed that 63 percent of confirmed data breaches exploited passwords that had been poorly generated, left unchanged from default, or stolen. The latest Verizon report can be downloaded from Verizon at https://www.verizonenterprise.com/verizon-insights-lab/dbir.
24A range of examples of phishing scams, as well as advice about how to detect phishing attacks and avoid falling for them, can be found at Phishing.org, accessed August 4, 2019, https://www.phishing.org.
25A substantial body of evidence suggests that mandating regular password changes can be unproductive: Lorrie Cranor, “Time to Rethink Mandatory Password Changes,” Federal Trade Commission, March 2, 2016, https://www.ftc.gov/news-events/blogs/techftc/2016/03/time-rethink-mandatory-password-changes.
26While online-banking authentication tokens remain in widespread use, they are relatively expensive to implement. An alternative solution is to utilize devices capable of cryptographic computation that customers already possess, which is why banks are increasingly supporting the use of apps running on mobile phones for authentication. Another approach is to use keys that customers already possess, which is why some banks issue card readers capable of communicating with keys stored on the chip on the customer’s bank card.
27Predictive algorithms can be used to monitor the lag between an individual token and the master clock on which the system is based. When a customer attempts to authenticate, the bank uses the predictive algorithm to estimate the time that’s on the token’s clock, on the basis of past interactions with that token. The bank could also choose to consider any time within a small time window to be acceptable.
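The clock-drift idea can be sketched in a few lines of Python. This is a simplified, TOTP-style illustration rather than any bank's actual scheme; the 30-second time step, the drift window, and the function names are assumptions made purely for the example.

    import hmac, hashlib, struct, time

    def code_for(secret: bytes, step: int) -> int:
        """Derive a six-digit code from a shared secret and a 30-second time step."""
        mac = hmac.new(secret, struct.pack(">Q", step), hashlib.sha1).digest()
        offset = mac[-1] & 0x0F
        return (struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF) % 1_000_000

    def verify(secret: bytes, submitted: int, drift: int = 1) -> bool:
        """Accept a code from the current step or a step either side,
        tolerating a token clock that runs slightly fast or slow."""
        now = int(time.time()) // 30
        return any(code_for(secret, now + d) == submitted for d in range(-drift, drift + 1))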
28Numerous well-publicized attacks have targeted car key systems. Some of these have been possible because a car manufacturer didn’t adopt “perfect passwords” and instead used default passwords common to all cars of a certain type. However, even those using perfect passwords have come unstuck through variants of relay attacks, in which an attacker with a special radio device positions themselves in the middle of an attacker-initiated conversation between a car (sitting on a driveway) and a key (hanging in the hallway of a house); see, for example, David Bisson, “Relay Attack against Keyless Vehicle Entry Systems Caught on Film,” Tripwire, November 29, 2017, https://www.tripwire.com/state-of-security/security-awareness/relay-attack-keyless-vehicle-entry-systems-caught-film.
29Not all boomerangs are designed to come back. In this hunting scenario, the boomerang is intended to fly behind the ducks and scare them into flight toward the hunter; hence, the boomerang is not really required to return to the hunter’s hand. Never mind—I only want an analogy! Boomerang purists are strongly encouraged to read Philip Jones, Boomerang: Behind an Australian Icon (Wakefield Press, 2010).
30Melaleuca quinquenervia is a tree, native to Southeast Asia and Australia, that has been introduced throughout the world both as an ornamental tree and for draining wetlands. It has a strong-scented blossom whose fragrance is not always appreciated.
31A similar principle lies behind identification, friend or foe (IFF) systems, first designed in the 1930s to address the problem of establishing whether an approaching aircraft was an ally or an enemy. For a historical review, see Lord Bowden, “The Story of IFF (Identification Friend or Foe),” IEE Proceedings A (Physical Science, Measurement and Instrumentation, Management and Education, Reviews) 132, no. 6 (October 1985): 435–37.
32The most recent version of TLS is specified in “The Transport Layer Security (TLS) Protocol Version 1.3,” Request for Comments: 8446, Internet Engineering Task Force, August 2018, https://tools.ietf.org/html/rfc8446.
33The case for anonymity being regarded as a fundamental human right is made in Jillian C. York, “The Right to Anonymity Is a Matter of Privacy,” Electronic Frontier Foundation, January 28, 2012, https://www.eff.org/deeplinks/2012/01/right-anonymity-matter-privacy.
34An introduction to the different ways that human behavior appears to change in cyberspace is Mary Aiken, The Cyber Effect (John Murray, 2017).
35A good resource for explanations about the threats that harassing behaviors present and how to address them is “Get Safe Online—Free Expert Advice,” Get Safe Online, accessed June 10, 2019, https://www.getsafeonline.org.
36The data we leave in our tracks as we conduct activities in cyberspace is sometimes known as our digital footprint. One of the best ways of understanding this concept is to appreciate how investigators attempt to reconstruct activities in cyberspace by using digital forensics. A good introduction is John Sammons, The Basics of Digital Forensics (Syngress, 2014).
37Tor is free software that can be downloaded from https://www.torproject.org.
38A fascinating investigation into some of the more sinister activities in cyberspace facilitated by anonymity is Jamie Bartlett, The Dark Net (Windmill, 2015).
39Many of the original pioneers of the internet regarded cyberspace as a new world free from the constraints of established society. The ability to remain anonymous in cyberspace was key to realizing this vision, as documented in Thomas Rid, Rise of the Machines (W. W. Norton, 2016).
40Andrew London, “Elon Musk’s Neuralink—Everything You Need to Know,” TechRadar, October 19, 2017, https://www.techradar.com/uk/news/neuralink.
Chapter 7: Breaking Cryptosystems
1Even when nuts and bolts are blamed for the failure of a bridge, the reason is often that they were inappropriately used. For example, the failure of a 2016 bridge in Canada was blamed on the overloading of bolts, not on the bolts themselves: Emily Ashwell, “Overloaded Bolts Blamed for Bridge Bearing Failure,” New Civil Engineer, September 28, 2016, https://www.newcivilengineer.com/world-view/overloaded-bolts-blamed-for-bridge-bearing-failure/10012078.article. As I will discuss later, inappropriate use of cryptographic algorithms is a potential reason for failure of a cryptosystem.
2Caesar’s use of encryption is described in C. Suetonius Tranquillus, “De vita Caesarum,” 121. A translation is available from “The Lives of the Twelve Caesars, Complete by Suetonius,” Project Gutenberg, accessed June 10, 2019, https://www.gutenberg.org/files/6400/6400-h/6400-h.htm (see Caius Julius Caesar Clause 56 for discussion of encryption).
3For more information about Mary’s ciphers and the Babington Plot to oust Elizabeth I, see “Mary, Queen of Scots (1542–1587),” National Archives (UK), accessed June 10, 2019, http://www.nationalarchives.gov.uk/spies/ciphers/mary. Mary’s use of encryption is also discussed in Simon Singh, The Code Book (Fourth Estate, 1999).
4Further details of Elizabeth I’s sophisticated espionage agency can be found in Robert Hutchinson, Elizabeth’s Spy Master: Francis Walsingham and the Secret War That Saved England (Weidenfeld & Nicolson, 2007).
5For example, ISO/IEC 18033 is a multipart standard that specifies a range of encryption algorithms: “ISO/IEC 18033 Information Technology—Security Techniques—Encryption Algorithms,” International Organization for Standardization.
6Bruce Schneier has, over the years, “outed” a long list of mis-sold substandard cryptographic products, which he refers to as cryptographic “snake oil,” in the archives of his Crypto-Gram newsletter: Schneier on Security, accessed August 4, 2019, https://www.schneier.com/crypto-gram.
7According to Donald Rumsfeld: “Reports that say that something hasn’t happened are always interesting to me, because as we know, there are known knowns; there are things we know we know. We also know there are known unknowns; that is to say we know there are some things we do not know. But there are also unknown unknowns—the ones we don’t know we don’t know. And if one looks throughout the history of our country and other free countries, it is the latter category that tend to be the difficult ones.” Full transcript available from Donald H. Rumsfeld, “DoD News Briefing—Secretary Rumsfeld and Gen. Myers,” US Department of Defense, February 12, 2002, http://archive.defense.gov/Transcripts/Transcript.aspx?TranscriptID=2636.
8While the NSA appeared to shorten the DES key length, it is believed that the agency strengthened the algorithm itself by optimizing it against an attack technique known as differential cryptanalysis, which was not discovered by the public research community until the 1980s. See, for example, Peter Bright, “The NSA’s Work to Make Crypto Worse and Better,” Ars Technica, June 9, 2013, https://arstechnica.com/information-technology/2013/09/the-nsas-work-to-make-crypto-worse-and-better; for more details, see Don Coppersmith, “The Data Encryption Standard (DES) and Its Strength against Attacks,” IBM Journal of Research and Development 38, no. 3 (1994): 243–50.
9This follows from the fact that the playing field is not level. The intelligence community employs many cryptographers and has access to everything the public community publishes, yet it only rarely shares its own knowledge. It must therefore know more about cryptography than the public community does. The question is: Does it know anything significant that the public community does not? And how would we ever find out?
10The most famous technique for predicting computing power is Moore’s law. Intel’s Gordon Moore proposed a rule of thumb that the number of components on an integrated circuit would approximately double every two years. While this estimate was believed to be fairly accurate for several decades, it now looks unlikely to be the best gauge of future progress: M. Mitchell Waldrop, “The Chips Are Down for Moore’s Law,” Nature, February 9, 2016, https://www.nature.com/news/the-chips-are-down-for-moore-s-law-1.19338.
11Xiaoyun Wang et al., “Collisions for Hash Functions MD4, MD5, HAVAL-128 and RIPEMD,” Cryptology ePrint Archive 2004/199, rev. August 17, 2004, https://eprint.iacr.org/2004/199.pdf.
12Intriguingly, a machine that can decrypt ciphertext without knowledge of the algorithm that was used features in Dan Brown’s novel Digital Fortress (St. Martin’s Press, 1998).
13Exhaustive key searches are sometimes known as brute force attacks.
14I based this crude calculation on an analysis similar to that provided in Mohit Arora, “How Secure Is AES against Brute Force Attacks?” EE Times, May 7, 2012, https://www.eetimes.com/document.asp?doc_id=1279619.
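For a feel for the scale of such a search, the back-of-the-envelope Python calculation below uses assumed (and very generous) guessing rates; the figures are illustrative only.

    # Rough estimate of exhaustively searching a 128-bit key space.
    keys = 2 ** 128
    guesses_per_second = 10 ** 12        # one hypothetical, very fast machine
    machines = 10 ** 9                   # a billion such machines working in parallel
    seconds = keys / (guesses_per_second * machines)
    years = seconds / (60 * 60 * 24 * 365)
    print(f"{years:.2e} years")          # on the order of ten billion years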
15This estimate is from Whitfield Diffie and Martin E. Hellman, “Exhaustive Cryptanalysis of the NBS Data Encryption Standard,” Computer 10 (1977): 74–84.
16The DESCHALL Project was the first winner of a set of challenges issued by the cybersecurity company RSA Security in 1997, winning a $10,000 prize for successfully conducting an exhaustive search for a DES key. The full story of the project is told in Matt Curtin, Brute Force (Copernicus, 2005).
17See Sarah Giordano, “Napoleon’s Guide to Improperly Using Cryptography,” Cryptography: The History and Mathematics of Codes and Code Breaking (blog), accessed June 10, 2019, http://derekbruff.org/blogs/fywscrypto.
18Much has been written about the cryptanalysis of the Enigma machines. One of the most detailed and authoritative sources is Władysław Kozaczuk, Enigma: How the German Machine Cipher Was Broken, and How It Was Read by the Allies in World War Two (Praeger, 1984).
19These techniques include encrypting each plaintext block along with a counter that increments after each encryption, and encrypting each plaintext block along with the previous ciphertext block (which is essentially a random number). For details, see “Block Cipher Techniques—Current Modes,” NIST Computer Security Resource Center, updated May 17, 2019, https://csrc.nist.gov/Projects/Block-Cipher-Techniques/BCM/Current-Modes.
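As a minimal sketch of the counter idea, the following uses AES in counter (CTR) mode via recent versions of the third-party Python cryptography package; the key, nonce, and message are placeholders for the example.

    import os
    from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

    key = os.urandom(32)     # 256-bit AES key
    nonce = os.urandom(16)   # fresh initial counter block for this message
    # Each plaintext block is combined with an encrypted, incrementing counter,
    # so identical plaintext blocks produce different ciphertext blocks.
    encryptor = Cipher(algorithms.AES(key), modes.CTR(nonce)).encryptor()
    ciphertext = encryptor.update(b"attack at dawn attack at dawn") + encryptor.finalize()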
20You can see a copy of this message at “The Babington Plot,” Secrets and Spies, National Archives (UK), accessed June 10, 2019, http://www.nationalarchives.gov.uk/spies/ciphers/mary/ma2.htm.
21Indeed, some phishing attacks work this way by offering you an apparently secure web link to a website address that closely resembles a genuine one (such as your bank’s address) but is, in fact, the attacker’s website. By using a TLS connection, the attacker is now able to use cryptography to prevent any other attacker from viewing or modifying the data that you now mistakenly send to the fake website!
22RC4 is no longer regarded as secure enough for most modern applications of cryptography: John Leyden, “Microsoft, Cisco: RC4 Encryption Considered Harmful, Avoid at All Costs,” Register, November 14, 2013, https://www.theregister.co.uk/2013/11/14/ms_moves_off_rc4.
23The details of cryptographic weaknesses in WEP are widely available. See, for example, Keith M. Martin, Everyday Cryptography, 2nd ed. (Oxford University Press, 2017), 488–95.
24The WEP protocol for Wi-Fi security was first upgraded to a protocol called WPA (Wi-Fi Protected Access), then in 2004 to a more secure version called WPA2, which became the default security protocol for Wi-Fi. In 2018 it was announced that WPA2 itself was to be upgraded to WPA3, although the rollout of this latest version is expected to take many years to complete, since, in many cases, the protocol will be upgraded only when equipment is replaced.
25Bruce Schneier, “Why Cryptography Is Harder Than It Looks,” Information Security Bulletin, 1997, Schneier on Security, https://www.schneier.com/essays/archives/1997/01/why_cryptography_is.html.
26“Keynote by Mr. Thomas Dullien—CyCon 2018,” NATO Cooperative Cyber Defence Centre of Excellence (CCDCOE), YouTube, June 20, 2018, https://www.youtube.com/watch?v=q98foLaAfX8.
27Paul C. Kocher, “Announce: Timing cryptanalysis of RSA, DH, DSS,” sci.crypt, December 11, 1995, https://groups.google.com/forum/#!msg/sci.crypt/OvUlewbjfa8/a1kP6WjW1lUJ.
28Paul C. Kocher, “Timing Attacks on Implementations of Diffie-Hellman, RSA, DSS, and Other Systems,” in Proceedings of the 16th Annual International Cryptology Conference on Advances in Cryptology, Lecture Notes in Computer Science 1109 (Springer, 1996), 104–13.
29It turns out that the intelligence community was aware of some of the threats posed by side channels much earlier, as shown by a 1972 article that was declassified in 2007: “TEMPEST: A Signal Problem,” NSA Cryptologic Spectrum 2, no. 3 (Summer 1972): 26–30, https://www.nsa.gov/news-features/declassified-documents/cryptologic-spectrum/assets/files/tempest.pdf.
30These sketches could easily come from a James Bond movie. Spanish actor Javier Bardem played the villain in the 2012 Bond movie Skyfall (directed by Sam Mendes, Columbia Pictures, 2012).
31This is just the same as for physical keys. Looking after a front-door key is easier than protecting an entire property. Safeguarding the key might not be as strong a security measure as employing a team of guards and vicious dogs, but it’s a pragmatic substitute.
32Use of default passwords is more common than many people realize: “Risks of Default Passwords on the Internet,” Department of Homeland Security, June 24, 2013, https://www.us-cert.gov/ncas/alerts/TA13-175A.
33The website does not explicitly offer you the public key, of course; this happens in the background, and it’s your web browser that receives the offered key on your behalf. You can, however, choose to take a look at this key by selecting appropriate browser settings.
34You can have a lot of fun, and waste an enormous amount of time, researching views on this topic. Just one example of many forums where the question of the existence of true randomness has been posed is Debate.org, where you can find the following discussion: “Philosophically and Rationally, Does Randomness or Chance Truly Exist?” Debate.org, accessed June 10, 2019, http://www.debate.org/opinions/philosophically-and-rationally-does-randomness-or-chance-truly-exist.
35Tossing a coin might not be as good a way of generating randomness as some people think. A 2007 study showed that there tends to be a slight bias toward a manually flipped coin landing the same way up as it was before being tossed: Persi Diaconis, Susan Holmes, and Richard Montgomery, “Dynamical Bias in the Coin Toss,” SIAM Review 49, no. 2 (April 2007): 211–35.
36An interesting website on the topic of randomness is Random.org, accessed June 10, 2019, https://www.random.org. You can learn more about the challenges involved in generating true randomness from physical sources, as well as generate your own true random numbers using a generator based on atmospheric noise.
37A well-publicized example in 2008 was the extremely poor pseudorandom number generator used in the Debian operating system for supporting an earlier version of the TLS protocol. This generator could generate only a small fraction of the “random” numbers that were required for the algorithms it supported: “Debian Security Advisory: DSA-1571-1 openssl—Predictable Random Number Generator,” Debian, May 13, 2008, https://www.debian.org/security/2008/dsa-1571.
38Wikipedia, s.v. “Key Derivation Function,” accessed June 10, 2019, https://en.wikipedia.org/wiki/Key_derivation_function.
39Arjen K. Lenstra et al., “Ron Was Wrong, Whit Is Right,” Cryptology ePrint Archive, [February 12, 2012], https://eprint.iacr.org/2012/064.pdf.
40This process is governed by a set of security standards for GSM, and for 3G and 4G security, that specify the cryptography used in mobile systems. See, for example, Jeffrey A. Cichonski, Joshua M. Franklin, and Michael J. Bartock, “Guide to LTE Security,” NIST Special Publication 800-187, December 21, 2017, https://nvlpubs.nist.gov/nistpubs/specialpublications/nist.sp.800-187.pdf.
41The Diffie-Hellman key agreement protocol is increasingly preferred over more classical hybrid encryption as a means of agreeing on a secret key. The primary reason is that it offers greater security in the event that a long-term private key is exposed (a property sometimes called perfect forward secrecy). Details of the Diffie-Hellman protocol can be found in almost any cryptographic textbook. The protocol first appeared in Whitfield Diffie and Martin E. Hellman, “New Directions in Cryptography,” IEEE Transactions on Information Theory 22, no. 6 (1976): 644–54.
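The essence of the protocol can be seen in a toy Python calculation; the numbers below are deliberately tiny and insecure, chosen only so that the arithmetic is visible, whereas real deployments use moduli thousands of bits long or elliptic curves.

    # Toy Diffie-Hellman exchange (illustrative parameters only).
    p, g = 23, 5              # public modulus and generator
    a, b = 6, 15              # Alice's and Bob's private values
    A = pow(g, a, p)          # Alice sends g^a mod p
    B = pow(g, b, p)          # Bob sends g^b mod p
    # Both sides now compute the same shared secret, g^(ab) mod p.
    assert pow(B, a, p) == pow(A, b, p)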
42For example, a public key based on 256-bit elliptic curves is often represented by about 130 hexadecimal characters.
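This figure is easy to check with the third-party Python cryptography package; the sketch below generates a P-256 key pair and prints the length of the uncompressed public point in hexadecimal (a 65-byte point encodes to 130 hex characters).

    from cryptography.hazmat.primitives.asymmetric import ec
    from cryptography.hazmat.primitives.serialization import Encoding, PublicFormat

    private_key = ec.generate_private_key(ec.SECP256R1())
    # An uncompressed P-256 public key is a 65-byte curve point.
    point = private_key.public_key().public_bytes(Encoding.X962, PublicFormat.UncompressedPoint)
    print(len(point.hex()))  # 130 hexadecimal characters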
43A certificate authority can be any organization that users trust enough to issue certificates. Let’s Encrypt is an example of a noncommercial certificate authority established to encourage the widespread use of cryptography, particularly TLS, by issuing free certificates. For more information, see Let’s Encrypt, accessed June 10, 2019, https://letsencrypt.org.
44The aforementioned problems regarding poorly generated RSA keys cannot be solved by certification. Most of these poorly generated RSA keys were certified by CAs that were simply asserting who owned the keys, not vouching for the quality of the keys. The situation might be different when a certificate authority has extended responsibilities that include key generation. In this case, a certificate authority could develop a poor reputation from reported incidents involving bad practice and, as a result, become less trusted.
45Web browser providers should maintain lists of root certificates that they have agreed to support within their software. The list of those supported by Apple, for example, can be inspected at “Lists of Available Trusted Root Certificates in iOS,” Apple, accessed June 10, 2019, https://support.apple.com/en-gb/HT204132.
46You have almost certainly been in the situation where, during an attempt to access a web page, you received a certificate warning and just ignored it by clicking it away. We take a risk when we do this. While certificate warnings can arise through errors or failures to update certificates, they also arise in situations where there is a more serious problem, such as when the website is known to be untrustworthy.
47A detailed exploration of the intricacies of managing public-key certificates can be found in Johannes A. Buchmann, Evangelos Karatsiolis, and Alexander Wiesmaier, Introduction to Public Key Infrastructures (Springer, 2013).
48The seemingly endless reports of mass data breaches often concern databases that are insecurely maintained by organizations. The European Union General Data Protection Regulation (GDPR), which came into force in May 2018, is partly about trying to address incidents of this type.
49For guidance on how to really get rid of data, see, for example, “Secure Sanitisation of Storage Media,” National Cyber Security Centre, September 23, 2016, https://www.ncsc.gov.uk/guidance/secure-sanitisation-storage-media.
50For an introduction to hardware security modules, see Jim Attridge, “An Overview of Hardware Security Modules,” SANS Institute Information Security Reading Room, January 14, 2002, https://www.sans.org/reading-room/whitepapers/vpns/overview-hardware-security-modules-757.
51The original quote is from G. Spafford, “Rants & Raves,” Wired, November 25, 2002.
52See, for example, Arun Vishwanath, “Cybersecurity’s Weakest Link: Humans,” Conversation, May 5, 2016, https://theconversation.com/cybersecuritys-weakest-link-humans-57455.
53Any organization protecting laptops in this way would, hopefully, use a hardware security module and serious security management procedures to protect the master key, so our disastrous scenario should not unfold.
54A series of papers on the difficulty that users experience with encryption software began with Alma Whitten and J. D. Tygar, “Why Johnny Can’t Encrypt: A Usability Evaluation of PGP 5.0,” in Proceedings of the Eighth USENIX Security Symposium (Security ’99), August 23–26, 1999, Washington, D.C., USA (USENIX Association, 1999), 169–83. It was followed by Steve Sheng et al., “Why Johnny Still Can’t Encrypt: Evaluating the Usability of Email Encryption Software,” in Proceedings of the Second Symposium on Usable Privacy and Security (ACM, 2006); and then Scott Ruoti et al., “Why Johnny Still, Still Can’t Encrypt: Evaluating the Usability of a Modern PGP Client,” arXiv, January 13, 2016, https://arxiv.org/abs/1510.08555. You can see where this is going without even reading these articles.
55Note that sending users on a training course does not necessarily solve this problem. A system that is difficult to use may suffer from ongoing problems even after users have attended a formal training program. Unless humans regularly perform a complex task, we are quite likely to forget the skills acquired during training.
56The argument that bad cryptography is sometimes worse than no cryptography is made, for example, in Erez Metula, “When Crypto Goes Wrong,” OWASP Foundation, accessed June 10, 2019, https://www.owasp.org/images/5/57/OWASPIL2011-ErezMetula-WhenCryptoGoesWrong.pdf.
Chapter 8: The Cryptography Dilemma
1One of the most infamous ransomware attacks was WannaCry, which affected over 200,000 computers worldwide in May 2017. Defense measures against WannaCry were developed relatively quickly, limiting its damage. For an introduction to ransomware and how to deal with it, see, for example, Josh Fruhlinger, “What Is Ransomware? How These Attacks Work and How to Recover from Them,” CSO, December 19, 2018, https://www.csoonline.com/article/3236183/ransomware/what-is-ransomware-how-it-works-and-how-to-remove-it.html.
2For a list of real cases in which criminal investigators came up against encrypted devices, see Klaus Schmeh, “When Encryption Baffles the Police: A Collection of Cases,” ScienceBlogs, accessed June 10, 2019, http://scienceblogs.de/klausis-krypto-kolumne/when-encryption-baffles-the-police-a-collection-of-cases.
3There is some evidence that cyberattacks have been launched by law enforcement agencies against services using Tor. See, for example, Devin Coldewey, “How Anonymous? Tor Users Compromised in Child Porn Takedown,” NBC News, August 5, 2013, https://www.nbcnews.com/technolog/how-anonymous-tor-users-compromised-child-porn-takedown-6C10848680.
4The use of messaging services by terrorist groups has provoked some of the most contentious controversies over the use of encryption: Gordon Rayner, “WhatsApp Accused of Giving Terrorists ‘a Secret Place to Hide’ as It Refuses to Hand Over London Attacker’s Messages,” Telegraph, March 27, 2017, https://www.telegraph.co.uk/news/2017/03/26/home-secretary-amber-rudd-whatsapp-gives-terrorists-place-hide.
5In fact, encrypting and throwing away the key is sometimes proposed as a deliberate means of permanently deleting data on a disk. There are, however, some strong arguments for why this might not be the best way of disposing of data: Samuel Peery, “Encryption Is NOT Data Sanitization—Avoid Risk Escalation by Mistaking Encryption for Data Sanitation,” IAITAM, October 16, 2014, http://itak.iaitam.org/encryption-is-not-data-sanitization-avoid-risk-escalation-by-mistaking-encryption-for-data-sanitation.
6The case for inspecting incoming encrypted traffic is made in, for example, Paul Nicholson, “Let’s Encrypt—but Let’s Also Decrypt and Inspect SSL Traffic for Threats,” Network World, February 10, 2016, https://www.networkworld.com/article/3032153/security/let-s-encrypt-but-let-s-also-decrypt-and-inspect-ssl-traffic-for-threats.html.
7This is almost certainly what happened in the case of the Stuxnet infection of the Iranian Natanz uranium enrichment plant in 2010.
8The Electronic Frontier Foundation provides guidance on tools that can be used to protect online privacy, most of which are based on the use of cryptography: “Surveillance Self-Defense,” Electronic Frontier Foundation, accessed June 10, 2019, https://ssd.eff.org.
9Tom Whitehead, “Internet Is Becoming a ‘Dark and Ungoverned Space,’ Says Met Chief,” Telegraph, November 6, 2014, https://www.telegraph.co.uk/news/uknews/law-and-order/11214596/Internet-is-becoming-a-dark-and-ungoverned-space-says-Met-chief.html.
10“Director Discusses Encryption, Patriot Act Provisions,” FBI News, May 20, 2015, https://www.fbi.gov/news/stories/director-discusses-encryption-patriot-act-provisions.
11“Cotton Statement on Apple’s Refusal to Obey a Judge’s Order to Assist the FBI in a Terrorism Investigation,” Tom Cotton, Arkansas Senator, February 17, 2016, https://www.cotton.senate.gov/?p=press_release&id=319.
12“Apple-FBI Case Could Have Serious Global Ramifications for Human Rights: Zeid,” UN Human Rights Office of the High Commissioner, March 4, 2016, http://www.ohchr.org/EN/NewsEvents/Pages/DisplayNews.aspx?NewsID=17138.
13Esther Dyson, “Deluge of Opinions on the Information Highway,” Computerworld, February 28, 1994, 35.
14David Perera, “The Crypto Warrior,” Politico, December 9, 2015, http://www.politico.com/agenda/story/2015/12/crypto-war-cyber-security-encryption-000334.
15“Snowden at SXSW: ‘The Constitution Was Being Violated on a Massive Scale,’” RT, March 10, 2014, https://www.rt.com/usa/snowden-soghoian-sxsw-interactive-914.
16Amber Rudd, “Encryption and Counter-terrorism: Getting the Balance Right,” Telegraph, July 31, 2017, https://www.gov.uk/government/speeches/encryption-and-counter-terrorism-getting-the-balance-right.
17Omand was speaking about the passing of the UK Investigatory Powers Act 2016, which regulates aspects of interception of communications data. The quote is from Ruby Lott-Lavigna, “Can Governments Really Keep Us Safe from Terrorism without Invading Our Privacy?” Wired, October 20, 2016, https://www.wired.co.uk/article/david-omand-national-cyber-security.
18Encryption (“cryptography for data confidentiality”) mechanisms were classified as a dual-use technology under the 1996 Wassenaar Arrangement on Export Controls for Conventional Arms and Dual-Use Goods and Technologies: “The Wassenaar Arrangement,” accessed June 10, 2019, https://www.wassenaar.org.
19Even before cryptography entered mainstream use, the dilemma presented by the use of encryption existed because that dilemma is hardwired into the basic functionality of encryption. Encryption protects secrets. Mary, Queen of Scots, used encryption, which both protected her personal privacy and threatened the power of the state. Which of these mattered more is a question that depends on your point of view.
20By problematic I am not suggesting that these techniques should not have been pursued, but arguing that there are underlying difficulties with the approach that are very hard to mitigate.
21I am being deliberately provocative here. Nobody is asking for a cryptosystem to be made insecure. The state is asking for an alternative means of accessing the data protected by the cryptosystem. However, any such means, if deployed by the “wrong people” (such as criminals), would be regarded as a “break” of the cryptosystem.
22Normal users refers to essentially everyone except the state itself. This is a highly simplified scenario, as I hope you already realized.
23One of the first companies offering such a cryptographic product was Crypto AG, established in Switzerland in 1952 and still trading today: Crypto AG, accessed August 11, 2019, https://www.crypto.ch.
24There have long been rumors, partially substantiated, that in the 1950s Crypto AG cooperated with the NSA concerning sales of its devices to certain countries: Gordon Corera, “How NSA and GCHQ Spied on the Cold War World,” BBC, July 28, 2015, https://www.bbc.com/news/uk-33676028.
25Some governments almost certainly design their own secret algorithms to protect their data, which is fine if they have the expertise. Today, however, it would be naive for the government of Ruritania to blindly trust the government of Freedonia to supply it with technology that uses a secret Freedonian algorithm. Ruritania would be better served if it purchased commercial equipment that used state-of-the-art published algorithms.
26See, for example, Bruce Schneier, “Did NSA Put a Secret Backdoor in New Encryption Standard?” Wired, November 15, 2007, https://www.wired.com/2007/11/securitymatters-1115.
27The removal of Dual_EC_DRBG was announced in “NIST Removes Cryptography Algorithm from Random Number Generator Recommendations,” National Institute of Standards and Technology, April 21, 2014, https://www.nist.gov/news-events/news/2014/04/nist-removes-cryptography-algorithm-random-number-generator-recommendations. However, this action came nine years after Dual_EC_DRBG was approved as a standard for generating pseudorandom numbers. During this time it was adopted by several well-known security products, the manufacturers of some of which are alleged, controversially, to have cooperated with the NSA: Joseph Menn, “Exclusive: Secret Contract Tied NSA and Security Industry Pioneer,” Reuters, December 20, 2013, https://www.reuters.com/article/us-usa-security-rsa/exclusive-secret-contract-tied-nsa-and-security-industry-pioneer-idUSBRE9BJ1C220131220.
28Former NSA director Michael Hayden has suggested that some existing cryptosystems contain nobody-but-us (NOBUS) vulnerabilities, which are weaknesses known to the NSA that Hayden believes only the NSA could exploit. This is a very uncomfortable idea, requiring trust not just that the NSA’s exploitation of NOBUS vulnerabilities is ethical, but also that these weaknesses are not discoverable and exploitable by other parties: Andrea Peterson, “Why Everyone Is Left Less Secure When the NSA Doesn’t Help Fix Security Flaws,” Washington Post, October 4, 2013, https://www.washingtonpost.com/news/the-switch/wp/2013/10/04/why-everyone-is-left-less-secure-when-the-nsa-doesnt-help-fix-security-flaws.
29See, for example: “The Historical Background to Media Regulation,” University of Leicester Open Educational Resources, accessed June 10, 2019, https://www.le.ac.uk/oerresources/media/ms7501/mod2unit11/page_02.htm.
30Global Partners Digital maintains a world map that identifies national restrictions and laws concerning the use of cryptography: “World Map of Encryption Laws and Policies,” Global Partners Digital, accessed June 10, 2019, https://www.gp-digital.org/world-map-of-encryption.
31For a wider discussion of the politics surrounding cryptography in the latter decades of the twentieth century, including the issues of export controls, see Steven Levy, Crypto: Secrecy and Privacy in the New Cold War (Penguin, 2002).
32You can view the famous RSA munitions T-shirt (and even download original graphics to print your own) at “Munitions T-Shirt,” Cypherspace, accessed June 10, 2019, http://www.cypherspace.org/adam/uk-shirt.html.
33A good overview of some of the attitudes prevalent around this time toward cryptography and its ability to change society is Thomas Rid, Rise of the Machines (W. W. Norton, 2016). Another interesting perspective on this period is Arvind Narayanan, “What Happened to the Crypto Dream? Part 1,” IEEE Security & Privacy 11, no. 2 (March–April 2013): 75–76.
34Timothy C. May, “The Crypto Anarchist Manifesto,” November 22, 1992, https://www.activism.net/cypherpunk/crypto-anarchy.html.
35In 1991, Phil Zimmermann wrote the encryption software Pretty Good Privacy (PGP) and made it freely available. PGP was controversial in two ways: it used encryption strong enough to be subject to export controls in the US, and it deployed RSA, which was subject to commercial licensing restrictions. PGP soon made its way around the world and achieved broad acclaim. Zimmermann was the target of a US criminal investigation for the export control violation, which was eventually dropped.
36In 1995, cryptographer Daniel Bernstein brought the first of a series of cases against the US government, challenging the export restrictions on cryptography. A similar case, Junger v. Daley, was launched in 1996.
37An influential paper written by eleven cryptographic experts that critiques the idea of key escrow from a number of different angles is Hal Abelson et al., “The Risks of Key Recovery, Key Escrow, and Trusted Third-Party Encryption,” World Wide Web Journal 2, no. 3 (1997): 241–57.
38This slogan is generally associated with the crypto-anarchists of the 1990s (adapted from a similar slogan deployed by the US gun lobby) and can be found in a document prepared for the Cypherpunks mailing list: Timothy C. May, “The Cyphernomicon,” September 10, 1994, https://nakamotoinstitute.org/static/docs/cyphernomicon.txt.
39The UK Regulation of Investigatory Powers Act 2000 Part III (RIPA 3) gives the state the power to compel, under warrant, the disclosure of encryption keys or decryption of encrypted data. There have been convictions in the UK under RIPA 3. Forgetting or losing a key is not strictly a defense, but it’s possible to imagine situations where this could be a plausible defense argument.
40Just try searching for this phrase on the internet. You’ll be amazed by the number of articles taking (versions of) this phrase as their title.
41Some of the information from the thousands of documents leaked by Snowden found its way into press articles in, for example, the Guardian, the Washington Post, the New York Times, Le Monde, and Der Spiegel. There are many resources concerning the Snowden revelations. The story behind the leaks is covered in Glenn Greenwald, No Place to Hide (Penguin, 2015); and Citizenfour (directed by Laura Poitras, HBO Films, 2014); and dramatized in Snowden (directed by Oliver Stone, Endgame Entertainment, 2016).
42There are several repositories on the internet claiming to hold many of these documents, including “Snowden Archive,” Canadian Journalists for Free Expression, accessed June 10, 2019, https://www.cjfe.org/snowden.
43In May 2019 a weakness was reported in the WhatsApp messaging service that gave attackers access to smartphone data. Worryingly, the trigger required no user participation; the attack could be launched by a simple call to the phone: Lily Hay Newman, “How Hackers Broke WhatsApp with Just a Phone Call,” Wired, May 14, 2019, https://www.wired.com/story/whatsapp-hack-phone-call-voip-buffer-overflow.
44Ed Pilkington, “‘Edward Snowden Did This Country a Great Service. Let Him Come Home,’” Guardian, September 14, 2016, https://www.theguardian.com/us-news/2016/sep/14/edward-snowden-pardon-bernie-sanders-daniel-ellsberg.
45The problems created by the complexity of cyberspace are exacerbated by the increased functionality of devices. Phil Zimmermann has proposed that “the natural flow of technology tends to move in the direction of making surveillance easier, and the ability of computers to track us doubles every eighteen months”: Om Malik, “Zimmermann’s Law: PGP Inventor and Silent Circle Co-founder Phil Zimmermann on the Surveillance Society,” GigaOm, August 11, 2013, https://gigaom.com/2013/08/11/zimmermanns-law-pgp-inventor-and-silent-circle-co-founder-phil-zimmermann-on-the-surveillance-society.
46Danny Yadron, Spencer Ackerman, and Sam Thielman, “Inside the FBI’s Encryption Battle with Apple,” Guardian, February 18, 2016, https://www.theguardian.com/technology/2016/feb/17/inside-the-fbis-encryption-battle-with-apple.
47Danny Yadron, “Apple CEO Tim Cook: FBI Asked Us to Make Software ‘Equivalent of Cancer,’” Guardian, February 25, 2016, https://www.theguardian.com/technology/2016/feb/24/apple-ceo-tim-cook-government-fbi-iphone-encryption.
48Rachel Roberts, “Prime Minister Claims Laws of Mathematics ‘Do Not Apply’ in Australia,” Independent, July 15, 2017, https://www.independent.co.uk/news/malcolm-turnbull-prime-minister-laws-of-mathematics-do-not-apply-australia-encryption-l-a7842946.html.
49Metrics concerning the size of the Tor network can be found at “Tor Metrics,” Tor Project, accessed June 10, 2019, https://metrics.torproject.org/networksize.html.
50See, for example, Hannah Kuchler, “Tech Companies Step Up Encryption in Wake of Snowden,” Financial Times, November 4, 2014, https://www.ft.com/content/3c1553a6-6429-11e4-bac8-00144feabdc0.
51The UK “National Cyber Security Strategy 2016–2021,” HM Government, 2016, https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/567242/national_cyber_security_strategy_2016.pdf, explicitly identifies the importance of widespread use of encryption by noting that “cryptographic capability is fundamental to protecting our most sensitive information and to choosing how we deploy our Armed Forces and national security capabilities.”
52In 2015, a group of leading cryptographers outlined the security risks created by a range of approaches to facilitating law enforcement access to encrypted communications. See Hal Abelson et al., “Keys under Doormats,” Communications of the ACM 58, no. 10 (2015): 24–26.
53“Remarks by the President at South by Southwest Interactive,” White House, Office of the Press Secretary, March 11, 2016, https://obamawhitehouse.archives.gov/the-press-office/2016/03/14/remarks-president-south-southwest-interactive.
54Some means of gaining legal access to data are probably more acceptable to society than others; hence, a holistic view of the issue is the best way of identifying the most acceptable trade-offs: Andrew Keane Woods, “Encryption Substitutes,” Hoover Institution, Aegis Paper Series no. 1705, July 18, 2017, https://www.scribd.com/document/354096059/Encryption-Substitutes#from_embed.
55I say traditionally because mobile and fixed telephone networks are increasingly merging, with the encryption that used to protect only the first leg between mobile phone and base station now extending deeper into the networks than it used to.
56To be clear, my point is that if we were to redesign the architecture of the internet and renegotiate security within that architecture, we should consider what level of security is needed for specific services. It is evident that, today, end-to-end encryption raises genuine concerns for law enforcement. A true negotiation to find an acceptable way forward requires all sides to come to the table willing to make compromises. The only likely alternative is ongoing conflict.
57“Treaty between the United States of America and the Union of Soviet Socialist Republics on the Limitation of Strategic Offensive Arms (SALT II),” Bureau of Arms Control, Verification and Compliance, 1979, https://2009-2017.state.gov/t/isn/5195.htm.
58Daniel Moore and Thomas Rid, “Cryptopolitik and the Darknet,” Survival 58, no. 1 (2016): 7–38.
Chapter 9: Our Cryptographic Future
1To be strictly correct, the analogy is that you place a copy of the letter in each of the new boxes and then give these to your enemies, each of whom now has one copy of the letter in a box that they can break into, and one copy in a box that they cannot break into.
2This argument is particularly relevant to encryption. For other cryptographic services, such as data integrity, the situation might not be so serious. For example, if a digital-signature algorithm is broken and requires upgrading, it is possible to re-sign data using the new signature algorithm. A problem would arise only if the digital-signature algorithm used at the time the data was first signed wasn’t strong enough.
3Because cryptographic algorithms are often computationally intensive, in many applications of cryptography they are implemented in hardware, rather than software. This means that a change of algorithm often requires replacement hardware. For example, when the WEP Wi-Fi security protocol was declared obsolete in 2003 because a series of cryptographic weaknesses had been discovered, not all Wi-Fi devices could be updated to new security protocols. Therefore, some Wi-Fi users were faced with the choice of purchasing a new device or continuing to use broken cryptography.
4The perception of quantum as being mysterious, counterintuitive, and beyond the comprehension of most of us seems to be nurtured by the expert community. Nobel Prize–winning physicist Niels Bohr is often credited as the original source of the quote “Anyone who is not shocked by quantum theory has not understood it.” Even popular explanations of quantum concepts tend to be framed from a position of the seemingly unknowable; examples include Jim Al-Khalili, Quantum: A Guide for the Perplexed (Weidenfeld & Nicolson, 2012); and Marcus Chown, Quantum Theory Cannot Hurt You (Faber & Faber, 2014).
5Quantum random numbers have been commercially available since the turn of the century and are based on different types of quantum measurement. See, for example, “What Is the Q in QRNG?” ID Quantique, October 2017, https://www.idquantique.com/random-number-generation/overview; and “NIST’s New Quantum Method Generates Really Random Numbers,” National Institute of Standards and Technology, April 11, 2018, https://www.nist.gov/news-events/news/2018/04/nists-new-quantum-method-generates-really-random-numbers.
6A relatively accessible insight into the development of quantum computers is John Gribbin, Computing with Quantum Cats: From Colossus to Qubits (Black Swan, 2015).
7In the unlikely event that you have never hurled an angry bird at a pig, this is a reference to the core synopsis of Rovio Entertainment Corporation’s phenomenally successful series of Angry Birds games.
8Predictions about the future development timelines for quantum computing vary, as do views on what their eventual impact will be. The consensus appears to be that we will have powerful quantum computers . . . someday!
9An algorithm published by mathematician Peter Shor in 1994, now known as Shor’s algorithm, demonstrated that a quantum computer could solve both the factorization and discrete logarithm problems. The original paper is Peter W. Shor, “Algorithms for Quantum Computation: Discrete Logarithms and Factoring,” in Proceedings, 35th Annual Symposium on Foundations of Computer Science (IEEE Computer Society Press, 1994), 124–34. Shor’s algorithm has subsequently been used to factor relatively small numbers on fledgling quantum computers.
10In 2016, the US National Institute of Standards and Technology launched a program to design and evaluate postquantum asymmetric encryption algorithms that are believed to be secure against a quantum computer. This process is anticipated to take at least six years: “Post-quantum Cryptography Standardization,” NIST Computer Security Resource Center, updated June 10, 2019, https://csrc.nist.gov/Projects/Post-Quantum-Cryptography/Post-Quantum-Cryptography-Standardization.
11In 1996, computer scientist Lov Grover proposed an algorithm, now referred to as Grover’s algorithm, that shows how a quantum computer can speed up an exhaustive search for a key by a square root factor. This means that an exhaustive search of, say, 2¹²⁸ keys on a quantum computer will take “only” the time required for an exhaustive search of 2⁶⁴ keys on a conventional computer. Hence, symmetric-key lengths need to double in order to maintain equivalent security levels against a quantum computer. However, it is worth noting that this algorithm requires vast amounts of quantum memory. The original paper is Lov K. Grover, “A Fast Quantum Mechanical Algorithm for Database Search,” in Proceedings of the Twenty-Eighth Annual ACM Symposium on the Theory of Computing (ACM, 1996), 212–19.
12An accessible explanation of quantum key distribution can be found in Simon Singh, The Code Book (Fourth Estate, 1999).
13A review of some of the practical challenges facing the deployment of quantum key distribution can be found in Eleni Diamanti et al., “Practical Challenges in Quantum Key Distribution,” npj Quantum Information 2, art. 16025 (2016).
14The one-time pad is an extremely simple encryption algorithm whose modern form is sometimes called the Vernam cipher. It involves encrypting every plaintext bit into a ciphertext bit by adding a randomly generated key bit. In 1949, the one-time pad was shown by Claude Shannon to be the only “perfect” encryption algorithm, in the sense that an attacker cannot learn anything (new) about an unknown plaintext by observing a ciphertext. Unfortunately, the stringent requirement for truly random keys that are as long as the plaintext and must be freshly generated for every single encryption makes the one-time pad impractical to use in most situations.
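A minimal Python sketch of the modern XOR form follows; the message is a placeholder, and the key is freshly generated, as long as the message, and used exactly once, as the scheme demands.

    import secrets

    plaintext = b"MEET AT NOON"
    key = secrets.token_bytes(len(plaintext))     # truly fresh key, as long as the message
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    recovered = bytes(c ^ k for c, k in zip(ciphertext, key))
    assert recovered == plaintext
    # Reusing the key would leak the XOR of two plaintexts, destroying the "perfect" secrecy.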
15Internet-enabled versions of all these objects were available as commercial products in 2017: Matt Reynolds, “Six Internet of Things Devices That Really Shouldn’t Exist,” Wired, May 12, 2017, https://www.wired.co.uk/article/strangest-internet-of-things-devices.
16Reliable projections for the extent of the future IoT environment are hard to compile, but organizations such as Gartner and GSMA Intelligence consistently predict numbers on the order of 25 billion global connected IoT devices by 2025. The precise figures don’t matter; there are going to be loads of them!
17Many internet-connected devices are sold with poor security protection or even none. A major future challenge is to persuade suppliers, retailers, and regulators to ensure that IoT technology is sufficiently secure. See, for example, “Secure by Design: Improving the Cyber Security of Consumer Internet of Things Report,” Department for Digital, Culture, Media & Sport, UK Government, March 2018, https://www.gov.uk/government/publications/secure-by-design.
18In August 2018, the National Institute of Standards and Technology launched an AES-style competition to develop new cryptographic algorithms suitable for deployment in constrained environments where conventional algorithms, such as the AES, are not suitable: “Lightweight Cryptography,” NIST Computer Security Resource Center, updated June 11, 2019, https://csrc.nist.gov/Projects/Lightweight-Cryptography.
19David Talbot, “Encrypted Heartbeats Keep Hackers from Medical Implants,” MIT Technology Review, September 16, 2013, https://www.technologyreview.com/s/519266/encrypted-heartbeats-keep-hackers-from-medical-implants.
20The most obvious risks are that the data is observed, corrupted, or lost. However, a more likely consequence is that the data is exploited. Indeed, for many (free) cloud storage services it is possible that the exploitation of users’ data lies at the heart of the commercial proposition.
21For an overview and additional references to cryptography designed for cloud storage environments, see, for example, James Alderman, Jason Crampton, and Keith M. Martin, “Cryptographic Tools for Cloud Environments,” in Guide to Security Assurance for Cloud Computing, ed. Shao Ying Zhu, Richard Hill, and Marcello Trovati (Springer, 2016), 15–30.
22The first fully homomorphic encryption (FHE) scheme was proposed by Craig Gentry in “A Fully Homomorphic Encryption Scheme” (PhD diss., Stanford University, 2009), https://crypto.stanford.edu/craig/craig-thesis.pdf. Unfortunately, this scheme is completely impractical, being slow and computationally heavy to use. David Archer of the research and development firm Galois acknowledged in 2017 that, although his mission was to make FHE “practical and usable,” while speeds were improving, “we’re still not near real-time processing”: Bob Brown, “How to Make Fully Homomorphic Encryption ‘Practical and Usable,’” Network World, May 15, 2017, https://www.networkworld.com/article/3196121/security/how-to-make-fully-homomorphic-encryption-practical-and-usable.html.
23Peter Rejcek, “Can Futurists Predict the Year of the Singularity?” Singularity Hub, May 31, 2017, https://singularityhub.com/2017/03/31/can-futurists-predict-the-year-of-the-singularity/#sm.00001v8dyh0rpmee8xcj52fjo9w33.
24For good introductions to artificial intelligence and how developments might affect human society, see Max Tegmark, Life 3.0: Being Human in the Age of Artificial Intelligence (Penguin, 2018); and Hannah Fry, Hello World (Doubleday, 2018).
25These figures come from “Data Never Sleeps 6.0,” Domo, accessed June 10, 2019, https://www.domo.com/learn/data-never-sleeps-6.
26The phenomenon of massive-scale data collection and processing is sometimes referred to as big data. For good introductions to the possible implications of big data, see Viktor Mayer-Schönberger and Kenneth Cukier, Big Data: A Revolution That Will Transform How We Live, Work and Think (John Murray, 2013); and Bruce Schneier, Data and Goliath: The Hidden Battles to Collect Your Data and Control Your World (W. W. Norton, 2015).
27An interesting report on the potential impact of developments in artificial intelligence on cybersecurity is Miles Brundage et al., “The Malicious Use of Artificial Intelligence: Forecasting, Prevention, and Mitigation,” February 2018, https://maliciousaireport.com.
28My observations concerning the intimate relationship between cryptography and trust are inspired by a talk given by Professor Liqun Chen at the First London Crypto Day, Royal Holloway, University of London, June 5, 2017.
29Lexico, s.v. “Trust,” accessed June 12, 2019, https://www.lexico.com/en/definition/trust.
30Although the Snowden revelations concerned ways in which governments were attempting to manage cryptography, these revelations inevitably led some people to mistrust cryptography itself.
31An excellent read on the wider construction of trust in society, with a perspective on constructing trust in cyberspace, is Bruce Schneier, Liars and Outliers: Enabling the Trust That Society Needs to Thrive (Wiley, 2012).
32For details of past and future Real World Crypto symposia, see “Real World Crypto Symposium,” International Association for Cryptologic Research, accessed June 12, 2019, https://rwc.iacr.org.