In Chapter 6 we discussed entanglement in the context of Bell's inequality. We examine now a few applications that make use of this strange phenomenon.
INTRODUCTION TO CHAPTER 8
The development of quantum computers is driven in part by an ongoing commercial, governmental, and military need to transmit information in secret, and a corresponding need in the State Department and the military to be able to break into and read coded information. It is not just that the quantum computer can operate faster than a classical computer for some applications, but rather that it can operate in an entirely different manner by simultaneously addressing the problem at hand through computations in multiple paths, all using the same circuit elements. The same quantum properties promise faster computation for many other applications, including accurate simulations of physical situations in our (quantum) world.
While quantum computers will allow the cracking of classical codes, quantum properties provide new methods of encryption, codes that would seem as though they can in no way be broken. And any messages encoded in this manner cannot be tampered with without the sender and receiver knowing about it. These same properties make quantum copying or cloning impossible, while in principle allowing a form of teleportation, something not even theoretically possible by any classical means.
Please realize that I will be selective in my description of the topics in this chapter. I provide only enough for basic understanding and a sense of the exciting opportunities that lie ahead. For an entertaining and more comprehensive consideration of the history of development in quantum computers, encryption, and related physics, I suggest that you read Computing with Quantum Cats, by John Gribbin (Reference Z). For a somewhat-deeper examination of the essential elements of classical and quantum information theory, computers, cryptography, cloning, and teleportation, I similarly suggest lectures 19 through 22 of Quantum Mechanics: The Physics of the Microscopic World, by Benjamin Schumacher of Kenyon College (Reference Y). And, for a good recent overview not only updating broadly the development of quantum-computer elements but also indicating the hardware and software steps necessary to produce functioning quantum computers, I would recommend the YouTube video presentation by Krysta Svore of Microsoft, shared on October 23, 2014.1 To provide a context and define basic terminology, I start now with an examination of what can be done with classical information.
CLASSICAL INFORMATION IN BINARY DIGITAL FORM
I find it convenient to borrow a description from Schumacher's lecture 22. He starts out by asking us to remember the quiz game Twenty Questions, noting that every yes or no answer to the questions asked represents a single “bit” of information—in all, twenty bits. (The answers are binary: that is, they have only two possibilities: yes or no. These answers may be represented by the digits 1 and 0, hence the term “bit,” for “binary digit.”) Schumacher points out that the various combinations of these twenty bits, if applied toward guessing a particular word,2 are sufficient to sort through all but a few of a million possible words in the English dictionary. Said another way, all of these million words can be represented by different combinations of these twenty bits.
This seems rather incredible from twenty pieces of information, but we can see how this may happen by considering just a few bits to start with. The first bit allows either of two possible answers to the first question, yes or no, represented by 1 or 0. For each of these, the second bit allows 1 or 0, so that there are four combinations in all. For each of these four combinations there are two possible answers to the third question, making eight combinations in all. Then there will be sixteen combinations with the fourth bit and 32, 64, 128, and so on, as more bits are added. The number of combinations goes up pretty fast. We write the number of combinations in mathematical shorthand as 2^n, where the exponent n is the number of bits, and the expression means that 2 is multiplied by itself n times. And so, we say that the number of possible combinations increases exponentially. So, in the game of Twenty Questions, we find that the number of combinations, 2^20, is exactly 1,048,576—about a million. Each combination of 20 bits could, for example, represent or be a code for any one of over a million different words. Or each combination could represent one number between 1 and 1,048,576. (Twenty-one bits, for example, could represent any one of 2^21 words, or any number between 1 and 2,097,152—twice as many as 2^20.) We would say that these words or numbers are represented in a 20- or 21-bit binary register.
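The doubling with each added bit can be checked directly. A minimal sketch (the code and language choice are mine, for illustration only):

```python
# Each added bit doubles the number of distinct combinations.
combinations = 1
for bit in range(20):
    combinations *= 2

print(combinations)  # 1048576 combinations for twenty bits
print(2 ** 21)       # twenty-one bits double that: 2097152
```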
Instead of representing whole words (using a 21-bit register as indicated above), one can use the 2^7 = 128 combinations of a seven-bit register to represent any one item in a set including all of the following items: the twenty-six lowercase letters of the English alphabet; the twenty-six uppercase letters; the digits 0 through 9; and the punctuation, parentheses, and other common symbols such as those that exist on keyboards. Another of the combinations may be used to indicate an instruction, such as to begin a paragraph, and another may instruct either to begin or end a title or heading. Such a coding is precisely what is used to represent text in a binary digital transmittable form, in what is called the American Standard Code for Information Interchange (ASCII). This system has been used to enter, store, and display text in computers and (until 20083) transmit text over the World Wide Web (the Internet).
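The seven-bit ASCII pattern for any keyboard character can be looked up in a line or two of code. A sketch (the function name `to_ascii_bits` is mine, not part of the standard):

```python
def to_ascii_bits(ch):
    """Return the 7-bit ASCII pattern for a single character as a string of 1s and 0s."""
    return format(ord(ch), "07b")

print(to_ascii_bits("S"))  # 1010011
print(to_ascii_bits("%"))  # 0100101
```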
THE STORAGE AND COPYING OF CLASSICAL INFORMATION
The storage of all information is physical. That is, information is marked or contained in a tangible, physical way in substances or devices. There was a time when one would have to hand copy a letter of some importance to remember what he or she had written or to send information to more than one recipient. Then came the printing press, which allowed multiple copies to be easily made—by some accounts this was the most important invention in human history. In modern times, we store or copy information as photocopies, audio or video recordings, photographs, e-books, or digital files. Information can be copied and transferred faithfully, especially if stored in digital form. And copying has made our lives easier in many ways. But then, copying has also caused a problem, as when copies are made in violation of copyrights.
CLASSICAL ENCRYPTION, AND THE CODE-BREAKING MACHINES OF WWII
Throughout history, the need for secrecy and the need to discover another's secrets have spurred an evolving “arms race” between the invention of coding methods and the corresponding invention of the means for code breaking (cryptanalysis). Simple codes have been used since ancient times, including the substitution of one letter for another in any particular language. But in the common use of language there is a frequent repetition of certain combinations of letters (such as “th”) and the frequent use of certain words (such as “the” and “and”) that allow such codes to be broken with a relatively small amount of effort. Such code-breaking problems are often presented in the form of puzzles opposite the comic pages in our newspapers.
I introduce now, for illustration, a common, much more effective type of code that is not so easily broken, and one with a history, the one-time pad.
Suppose that Alice has a message that she wants to transmit in secret to Bob, her stockbroker. She wants to tell him to sell IBM stock. She can send a message in the following secure manner. (To simplify, we illustrate by coding to transmit only the letter “S,” perhaps to indicate “sell.”)
A. Alice expresses the letter S (the plaintext) as indicated below in ASCII form as a series of 1s and 0s. (Anyone knowledgeable would recognize this as ASCII and would be able to read the message.)
B. Alice adds to our ASCII message another arbitrary, randomly generated string of 1s and 0s (the key), known only to her as the sender and to Bob, the receiver.
C. Using the simple rule that any two 1s or any two 0s add to zero, but a 1 and a 0 add to 1, we get a digital sum called the ciphertext. If someone intercepts this ciphertext and uses the ASCII conversion to decode it, they will get another symbol or letter or number, not “S,” and the message will make no sense whatsoever. (For the random string of key digits that we have chosen, an ASCII conversion actually yields a percent sign, “%.”)
D. Bob, however, adds the key to the ciphertext, and voilà!
E. He has the plaintext message that Alice wished to send to him, and he can convert using the ASCII chart to reveal the letter “S.”
In the above example, Alice has sent information to Bob in a ciphered binary string of numbers that no one except Bob can decode. Because Bob has the key that Alice has provided separately, he can easily decode the ciphered information to get the plaintext, and then use the ASCII conversion chart to read Alice's intended message.
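Steps A through E can be sketched in a few lines of code. The digit-by-digit addition rule in step C is the bitwise XOR operation; the key below is chosen so that the ciphertext happens to read as the percent sign mentioned in step C (in practice the key would be truly random):

```python
def add_digits(bits_a, bits_b):
    """Digit-by-digit sum: 1 + 1 = 0, 0 + 0 = 0, 1 + 0 = 1 (i.e., XOR)."""
    return "".join("1" if a != b else "0" for a, b in zip(bits_a, bits_b))

plaintext = format(ord("S"), "07b")  # 'S' in 7-bit ASCII: 1010011
key       = "1110110"                # the shared secret, known only to Alice and Bob

ciphertext = add_digits(plaintext, key)
print(chr(int(ciphertext, 2)))       # an eavesdropper reading it as ASCII sees '%'

recovered = add_digits(ciphertext, key)  # Bob adds the key to the ciphertext
print(chr(int(recovered, 2)))            # 'S', Alice's intended message
```

Adding the key a second time cancels it out, which is why the same rule both encodes and decodes.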
Because sending different messages using the same key would soon allow someone else to deduce what the key is, this method of coding can only be used once or a couple of times with each key. And, of course, the success of this method requires that Alice be able to provide Bob with the key without anyone else gaining access to it, for then, they too would be able to decipher the secret message. The problem then is with key distribution. One famous (or infamous) example of key distribution involved the ciphered transmission of the rotor settings (the key) for the Enigma machine used by Germany just preceding and during World War II.
Many of us are aware of the Enigma machine, a system of rotors that provided the key for coding and decoding. Some of us saw the movie The Imitation Game, which is about the British effort to break its codes. The rotors could be set to work together in different ways to change the key, and instructions for the rotor settings would be sent out separately or somehow along with the coded message so the intended recipient could set the key. Without knowing the settings of the rotors of the machine, the key could not be determined.
Germany used ever more sophisticated versions of these machines to successfully send coded messages before and throughout World War II. It was the Poles who as early as 1932 mounted the strongest effort to break these codes, sharing their information with the French and the British and building the first computing machine for this purpose (called Bombas). The machine used electromechanical switches called relays to sort through the possible solutions to Enigma.
After the German invasion of Poland in 1939, the British—particularly benefiting from the genius of Alan Turing—were able to build a much more advanced and powerful machine, called Bombes; it was seven feet high by seven feet wide, and it was able to simulate the workings of thirty Enigma machines all wired together. However, the Allies were more often able to break these codes, not through any deficiency in Enigma, but through lapses in the way that the Germans used the machines, where, for example, repeated messages could be spotted and examined for patterns.
As Gribbin describes it, Turing designed Bombes to take advantage of the lapses, and his efforts probably shortened the war by two years and may even have been what kept Britain in the war at all. He cites one example in particular: In the summer of 1941, Britain was on the brink of starvation. Thanks to the code breakers and Bombes located at England's Bletchley Park, convoys from the United States to England went twenty-three days without a single sinking, steered away from the German U-boats with information provided from Bletchley.
But the Germans built an even more powerful successor to Enigma, called Tunny. Beating Tunny would be labor intensive. Gribbin paraphrases Turing: To comb for lapses “would require 100 Britons working eight hours a day on desk calculators for 100 years.”4 It became apparent that a new sort of machine would be needed to break the codes. The first prototype of this machine began operating in Britain in June 1943, once again using electrical relays. Its successor, another prototype, used nearly two thousand electronic tube switches (somewhat like the radio tubes that preceded semiconductor transistor switches, to be described shortly). It was put into operation at Bletchley Park in 1944. Called Colossus, it filled an entire room. This was the first electronic computer, but it was programmable only in a limited sense.
MODERN CLASSICAL COMPUTERS (FAMILIAR TO MOST OF US)
Rather than tubes, modern computers use semiconductor electronic switches, transistor bits that are caused to carry either a small electrical current or no current, to indicate either a 1 or a 0, respectively. Sixty-five years ago, these switches were the size of a dime. Since that time, manufacturing technology has steadily advanced to reduce their size and integrate multiples of them and their associated electrical circuitry on wafers of silicon as single chips. In 1965, Gordon Moore, cofounder of Intel and Fairchild Semiconductor, noted that the number of components being manufactured in such an integrated circuit was doubling every eighteen months (mainly by reducing the size of the transistors and other circuit elements within the chip). This doubling rate has become known as Moore's law. It continues to describe progress to this day. (But the sizes of classical circuit elements have now been reduced to just one hundred times the size of atoms, quantum entities that loom as a fundamental limit to how small one can go. So, in ten to twenty years, Moore's law will probably no longer hold. Improvement in the capacity of classical computers through the reduction of element size will probably come to an end, and continued progress will require operation in a quantum realm. We'll soon get on to discussing quantum computers, but first let's examine what happens in classical computation.)
The information that is stored in classical electronic circuits is measured in bytes, where each byte represents eight bits of information. Today, each everyday desktop or laptop computer may store hundreds of billions of bytes, that is, hundreds of gigabytes of information.
Note that ten million bytes (that is, ten megabytes = ten thousand kilobytes = 10,000 KB) may be needed to store the information contained in an average book or photograph, or a sound recording. So the modern computer is capable of storing tens of thousands of these items.
But the computer does not just store information. It processes the information according to instructions that it has been given. The computer processor manipulates information using transistor switches that are “wired” in gates to perform logical operations. For example, the XOR gate (= Exclusive OR gate) will operate to produce a 0 if two input pieces of information are either both 1 or both 0. Otherwise, it will produce a 1. (These are the very operations that Bob used to combine the key with the ciphertext in our description of the “one-time pad” in the indented illustration A through E above.)
The ten basic logic gates (all that are required to perform any logical operation or calculation) may be constructed as simple electrical circuits using combinations of transistors. For example, the XOR gate uses eight transistors appropriately wired together to take two input signals and yield one output signal according to whether the inputs are similar or not, as noted above. The input signals may come from other transistors, where the input “1” comes from a transistor which is “on” (carrying an electrical current in one direction), and the input “0” comes from a transistor which is “off,” that is, carrying zero or little electrical current in that same direction. The transistor currents are turned on or off depending on electrical inputs brought to them from outside sources or still other transistors.
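The logical behavior of such a gate (though not its transistor wiring) can be expressed in a few lines. A sketch of XOR built from the more basic AND, OR, and NOT operations, with inputs and outputs as 1s and 0s:

```python
def xor_gate(a, b):
    """Output 1 only when the two inputs differ, 0 when they match."""
    return (a and not b) or (not a and b)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", int(xor_gate(a, b)))
# 0 0 -> 0
# 0 1 -> 1
# 1 0 -> 1
# 1 1 -> 0
```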
On another level, the computer may search and decide: “If x,” “but not y,” “then z,” where x may mean “to have an appointment” and y may mean “to have a car” while z may mean “then take a bus.”
Part of a processor's speed is determined by its clock, which puts out a regular electrical signal that coordinates the action of the digital circuits containing the transistors. The more frequent the clock signal, the faster the computer. But the clock signals must be spaced apart enough in time (be of low-enough frequency) that the transistors can complete their switching (of the 1s and 0s) between signals.
The first general-purpose electronic computer, the ENIAC, was designed to produce artillery firing tables for the US Army. But it was soon diverted under the influence of the mathematician John von Neumann (then working at Los Alamos on the Manhattan Project) to run calculations on the feasibility of producing a hydrogen bomb. ENIAC's existence was announced to the public in 1946. It was 8 × 3 × 100 feet in size, contained 17,468 vacuum-tube electronic switches (bits), and ran with a 100-thousand-cycle-per-second clock (=100 kilohertz = 100 kHz). Each instruction took twenty clock signals to process, and so it had an instruction processing rate of 5 kHz.
For comparison, note that my Mac mini desktop computer (which is four years old at the time of this writing) has a 2 gigabyte memory (2 billion bytes, 16 billion bits, a million times the bits in ENIAC) and runs with a clock speed of 2.4 billion clock signals per second (2.4 GHz), that is, 24,000 times faster than ENIAC.
A CLASSICALLY UNBREAKABLE CODE
As powerful as modern computers are, there are many things that they still cannot do, and that includes the breaking of certain codes. For one modern approach to coding, called public key encryption, the “public key” that is used to code a message cannot be used to decode the message. The reading (or breaking) of the coded message requires a related “private key.” So, for example, Alice, who is to receive messages from Bob, may send to Bob as his public key a large number created by multiplying two large prime numbers together.
Prime numbers are numbers that have only two factors: the number 1 and the prime number itself. Said another way, prime numbers can only be divided (without generating fractions) by 1 and the prime number itself. Prime numbers include 2, 3, 5, 7, 11, and so on.
The encryption process in this case is not the simple addition rule noted in “C” of our boxed inset a bit earlier, and the public key can only be used to encode, not to decipher. So, Alice doesn't care if this public key is intercepted, because anyone having this number still will not be able to decode whatever Bob codes with it. Bob uses the public key to send a coded message to Alice. Alice uses her private key to decode the message that Bob sends. (Her private key is one of the prime numbers used to form the public key in the first place.) Alice never tells anyone what her private key is. And she doesn't have to send her private key anywhere, so no one can intercept or observe it.
Encryption using public keys formed as products of very large prime numbers is used by banks and the military to protect their transmissions. Such encryptions are also used to protect the information that you send when you enter your credit card information to purchase online.
Decoding the message or information protected using the private key requires either knowing or finding one of the two prime numbers. (Having one prime and knowing the public key product allows an easy calculation of the second prime. The first prime is just divided into the number of the public key.) So, someone trying to crack the code must figure out what the primes are. Whereas, for example, finding the primes 3 and 5 that are multiplied to make 15 requires only a little trial and error, finding the primes of a number 400 digits long (such as may be used for the protection of credit card information) would likely take billions of years, even using the best of modern (classical) computers. And newer computers to come will not significantly alter the fact that, as a practical matter, this factoring can't be done in any reasonable length of time using classical computers.
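The trial-and-error search described above can be sketched with simple trial division (the function name is mine). It finds the primes of 15 instantly, and the second prime then follows by division, but the number of candidate divisors grows so quickly with the size of the key that this approach is hopeless for a 400-digit number:

```python
def find_prime_factor(n):
    """Trial division: try each candidate divisor in turn until one divides n."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d      # smallest prime factor found
        d += 1
    return n              # n itself is prime

public_key = 15
p = find_prime_factor(public_key)  # 3
q = public_key // p                # the second prime follows by easy division: 5
print(p, q)
```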
QUANTUM COMPUTING
By contrast, a quantum computer having a processor of only 50 quantum bits would be able to run through the necessary code-breaking algorithm to find the private key in a few minutes.
An algorithm is a self-contained set of logical or mathematical operations, such as “if a, then b” or “add x and y.” By carrying out a set of these operations, a computer can make judgments or perform calculations based on input information.
No wonder, then, that efforts to build quantum computers are proceeding in a race funded by governments, big business, and the military. (As just one example of this, note that documents provided by Edward Snowden show that the US National Security Agency is funding a nearly $80 million program to develop a quantum computer “capable of breaking vulnerable codes.”5)
Because it is a quantum device, the calculations of a quantum computer ultimately involve probabilities, and so the result of a single computation may not be totally accurate. For the factoring task discussed above, this is not a problem, since the computer can always check its answers simply by multiplying the factors that it gets to see if their product matches that of the public key input. But, more generally, multiple calculations are run to provide a set of answers that will center around and define the correct answer. One can ask the computer to run the calculation as many times as is necessary to get an answer to a desired high level of certainty. This may extend the required computing time on complicated problems from seconds to perhaps a couple of hours, but that is still a very short time compared to the billions of years that may be needed to solve some problems or crack some codes using classical computers.
There are other intractable problems and classes of applications that may be addressed using quantum computers. These include faster search engines (like Google), rapid face and speech recognition (e.g., to identify one sought person—perhaps a terrorist—whose face appears in a photograph or video of a crowd of people), the simulation of all manner of physical and chemical processes, the design of new drugs and exotic materials, and efficient routing of people or the distribution of product. (Routing a salesperson efficiently to 14 different locations so that they don't incur unnecessary travel might take 100 seconds of classical computer time. But routing that same person to 22 destinations would require some 1,600 years using the best of today's computers.) Certain kinds of sorting problems are amenable to quantum computation. The example often used, just for illustration, is the reverse phone book, where one sorts to find the owner of a particular phone number. A classical computer can easily do this by performing, on average, perhaps a million operations (for a city of two million people). A quantum computer may do it with just a thousand operations (the square root of a million).
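The reverse-phone-book numbers quoted above follow from the square-root speedup of quantum search (Grover's algorithm). A quick check of the book's rough arithmetic:

```python
import math

classical_ops = 1_000_000  # average checks needed for a two-million-entry book
quantum_ops = math.isqrt(classical_ops)  # quantum search: about the square root

print(quantum_ops)  # 1000
```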
The quantum bits at the heart of the quantum computer behave much differently than classical bits.
Quantum bits are commonly referred to as qubits. The term was coined in 1992 during a discussion between Professor Benjamin Schumacher and William Wootters, then of Williams College, initially as a pun in relation to the biblical measurement of length, the cubit. (For instance, Noah's ark was so many cubits long, so many wide, and so many high.)
What distinguishes qubits is that they can be in a superposition of two states simultaneously, or be found in either of the two states, based on probabilities. This is not a simple statement. Qubits can be formed and behave in this manner because of the fundamental quantum makeup of our world.
To illustrate how qubits might be created physically, consider again the single electron in the hydrogen atom.
The overall wavefunction for the electron is a superposition of all of the possible electron's states, represented by all of their wavefunctions and all of their associated probability clouds (like those shown in Fig. 3.8), all superimposed and centered around the nucleus of the atom. In nature, things tend toward lowest energy, and so the probability of finding the electron (upon measurement) in its lowest energy “ground” state is greater than for the higher energy states. This probability information is also contained in the overall wavefunction. The wavefunctions may evolve and change with time. But until an observation (measurement) is actually made, the electron remains in a superposition of states and clouds according to the probabilities of the overall wavefunction.
The act of observation (measurement) selects one of the allowed states. But as soon as the observation is over, probabilities take over again, and another, different, overall wavefunction describes the electron and its new evolution in time.
Hypothetically, for illustration, suppose that a qubit can be created in a hydrogen atom by controlling an electron to switch occupancy between just two particular states, one representing a “0” and one representing a “1.” (The switching might be caused by shining laser light of just the right frequency on the atom, either inducing a transition of the electron from the higher to the lower energy state or, through the absorption of a single photon, pumping the electron up from the lower energy state to the higher energy state.) In this two-state arrangement, the overall state could be described as a combination of the two states, with the overall wavefunction containing the information on the probability of finding the electron in one of these states or the other.
As you will soon see, there are many types of particles and properties that are being developed to produce a binary combination of qubit states in somewhat this manner. In such binary physical systems, a combination of two states (and two-state wavefunctions) describes each qubit.
Two qubits in proximity can affect each other if their wavefunctions significantly overlap. In the right circumstances, each qubit can actually be linked by entanglement with the other qubit, so that together they are described by a new overall evolving wavefunction that treats them as a single entity. And this entanglement, once established, under the right conditions can be made to persist even if the two physical qubit objects then move far away from each other. So making an observation or measurement on one of the qubits will not only choose its state, but it will also immediately determine the state of its entangled partner, no matter how far away it is. (As you may recall, this ability of particles to stay entangled over long distances of separation is what Einstein called “spooky action at a distance.”)
The pair of qubits can be in any of the 2^2 = four combinations of their states or in a superposition of all four combinations simultaneously. (These four two-qubit combinations can represent the four binary digit combinations: “0, 0,” “0, 1,” “1, 0,” and “1, 1.”) As many as n qubits can all be further linked, in which case 2^n possible values (or numbers) can be simultaneously digitally represented and stored as 2^n different strings, each of n 0s and 1s. (This is distinct from n classical bits, which can represent and store only one of the 2^n values at a time.) And remember, if n = 20, then 2^20 = 1,048,576. That's a lot of simultaneously stored numbers for just twenty qubits!
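The counting above is easy to check: n qubits have 2^n basis strings of n 0s and 1s, all of which can coexist in superposition. A sketch (the function name is mine):

```python
from itertools import product

def basis_strings(n):
    """All 2**n strings of n 0s and 1s that n linked qubits can superpose."""
    return ["".join(bits) for bits in product("01", repeat=n)]

print(basis_strings(2))        # ['00', '01', '10', '11']
print(len(basis_strings(20)))  # 1048576
```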
So, why can't we link and entangle classical bits together in superposition for simultaneous operation in the same sort of way? We can't, mainly, because classical bits are necessarily spaced too far apart: their wavefunctions have no significant overlap. They are deliberately spaced apart so that the operation of one bit doesn't interfere and cause problems with the operation of its neighbor, like the shorting of current from one bit to another without traveling through the connecting circuitry.
Until recently (2015) the transistors (classical bits) and the deposited copper lines (acting as minuscule wires) that are used to electrically control transistor switching in integrated circuits have been thousands of atomic diameters across and separated by comparable distances. The diameter of an atom can be thought of as the distance over which the wavefunction of its lowest-energy (most likely occupied) state drops to essentially zero (where its probability cloud disappears into blackness). So, if the transistors are separated by thousands of atomic diameters and the wavefunctions of the atoms that they are made of essentially disappear over a distance of a few atomic diameters, there is little wavefunction overlap and cooperative behavior.
But, as pointed out earlier, transistors and circuit elements are now being made in sizes just hundreds of atoms across, and continued classical miniaturization according to Moore's law will soon (over the next ten to twenty years) reach a quantum limit as element dimensions approach the sizes of atoms and their wavefunctions begin to substantially overlap. Like it or not, then, continued miniaturization will involve inadvertent quantum connections. Work already started (to develop quantum bits) may be necessary for further progress in miniaturization, and to otherwise increase the data storage capacity and computing power of computer elements.
Quantum qubit properties also allow the formation of new types of logic gates and new and simpler algorithms for new and still faster quantum processing. To get a sense of this, note that one quantum-based algorithm to factor a number of n digits requires on the order of n^2 log(n) operations, which is far less complex and requires many fewer operations than the classically based factoring algorithm, which requires on the order of exp[n^(1/3) (log n)^(2/3)] operations.6
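The two scalings can be compared numerically. A sketch (constant factors are omitted and natural logarithms assumed, so only the trend, the widening gap as n grows, is meaningful, not the absolute values):

```python
import math

def quantum_ops(n):
    """Scaling of the quantum factoring algorithm: n^2 * log(n)."""
    return n ** 2 * math.log(n)

def classical_ops(n):
    """Scaling of the classical factoring algorithm: exp[n^(1/3) * (log n)^(2/3)]."""
    return math.exp(n ** (1 / 3) * math.log(n) ** (2 / 3))

# The classical count pulls away from the quantum count as n grows.
for digits in (50, 100, 400):
    print(digits, f"{classical_ops(digits) / quantum_ops(digits):.3g}")
```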
One particular logic gate, the CNOT gate (not possible with classical bits), operates by flipping the state of a target qubit only if (for example) a second control qubit is in a 1 (as opposed to a 0) state. What makes this process special in another way is that the two qubits can become entangled in the CNOT operation, linked together to represent any one of their four so-called Bell states,7 regardless of how far the qubits may later become physically separated from each other.
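A two-qubit state can be written as four amplitudes, one each for the combinations 00, 01, 10, and 11; CNOT then simply swaps the amplitudes of 10 and 11 (it flips the target only when the control is 1). Applying a so-called Hadamard operation to the control qubit and then CNOT turns the state 00 into the entangled Bell state with equal amplitudes on 00 and 11. A sketch in plain Python, with amplitudes held in a list (a simulation of the mathematics, not of any physical hardware):

```python
import math

def cnot(state):
    """CNOT with qubit 0 as control: swap the amplitudes of |10> and |11>."""
    a00, a01, a10, a11 = state
    return [a00, a01, a11, a10]

def hadamard_on_control(state):
    """Hadamard on the control qubit: puts it into an equal superposition."""
    h = 1 / math.sqrt(2)
    a00, a01, a10, a11 = state
    return [h * (a00 + a10), h * (a01 + a11),
            h * (a00 - a10), h * (a01 - a11)]

state = [1, 0, 0, 0]                   # start in the definite state |00>
bell = cnot(hadamard_on_control(state))
print(bell)                            # equal amplitudes (about 0.707) on |00> and |11>
```

Measuring either qubit of the resulting state immediately fixes the other, which is the entanglement described in the text.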
As noted earlier, the power of quantum information storage and processing (e.g., the running of multiple programs simultaneously using the same qubits) derives from the superposition of states within a qubit and the entanglement of two or more qubits. But remember, any outside contact, observation, or measurement on entangled particles or qubits introduces decoherence and breaks the entangled state, as described near the end of Chapter 6. So the quantum computer must be constructed so that it chugs away, doing multiple tasks in superposition without observation or measurement until an answer is found and is then read (observed).
Creating the physical qubit elements, linking them by entanglement, and maintaining the entangled state is not an easy matter. The qubits must be isolated from their surroundings so that outside influences (like the thermal oscillations of atoms normally present in solid materials) will not decohere the quantum entangled state. But they may not be so isolated one from another that they can't be linked by entanglement, and they must still be sufficiently accessible so that their initial states can be manipulated and their final states can be observed and read. Gribbin lists five requirements for qubits, summarized as follows8:
And Gribbin goes on to describe a half dozen approaches being explored for the construction of qubits meeting these criteria, tracing the progress of their development and their use in computers until 2014.9 Note: Here I use terms and concepts that may become more understandable after reading about the foundations of chemistry and materials science in Part Four and the nature of various superconductor and semiconductor devices in Part Five of this book. But this is a preview of coming applications, and so I leap ahead.
The following approaches are being explored toward quantum computing, each with its own particulars, advantages, and disadvantages, as described in Appendix C. As you will come to understand, quantum computing is still an application in the infancy of its development, but one that holds great promise.
QUANTUM TELEPORTATION (IT'S NOT WHAT YOU THINK IT IS.)
Quantum teleportation of a very basic form has already been demonstrated, and I'll soon get on to describing that. But teleportation as envisaged in Star Trek (“Beam me up, Scotty”), although possible in principle, seems highly, highly, highly unlikely. What makes it possible at all is a quantum process, because the constituents of our bodies are all quantum in nature, composed of atoms and molecules, as described briefly below.
A Refresher on the Quantum Atom
As you have already come to realize from Schrödinger's description of the hydrogen atom, the states of the electron in an atom are diffuse entities, with the position of the electron in the atom described in terms of clouds of probability, as shown for the hydrogen atom in Figure 3.8. Which of the states are occupied by electrons is also based on probabilities. Like most things in nature, every atom seeks a lowest-energy state, a ground state where all of the electrons occupy the lowest energy levels allowed to them, but there is always some probability that they will be in some higher energy state. The same description of states and probabilities applies to molecules, atoms in chemical combination.
While teleportation in the Star Trek sense would require the teleportation of all of the atomic and molecular constituents of the human body, the teleportation of just one atom should suffice to demonstrate the possibility of this actually happening in principle. We'll consider the teleportation of the simplest of atoms, hydrogen. But let's first examine the possibility of teleportation using classical means.
Is It Possible to Achieve Classical Teleportation?
The answer is no. We can't teleport an actual physical object by any known classical method. But we can teleport classical information; we do it all the time, using fax machines. Why not then measure or observe the properties of an object and then fax that information (using phone lines and satellite transmission at the speed of light) so that a copy of the physical object can be constructed far away based on the information that is sent? Surely we should be able to do this quickly on one atom to demonstrate the principle.
Well, there is a problem even with this. The very act of measurement or observation changes the state of the original atom that we want to send. It is the information on this changed state that we then send. Information on the original state of the atom, with all of its probabilities, is not preserved and not available to us.
In short, we have teleported a description of an atom—but not the atom that existed originally. A human constructed of atoms and molecules teleported in this way (though it might have the same numbers of atoms in the same places as in the original) would in fact not be the same person we were intending to teleport.
“But,” you may say, “we have teleported a body!” Sure, but instead of Scotty beaming up a particular person (say, John) by monitoring the state of every atom in John's body, we've caused every atom in him to change, and we've sent the information on the changed person on to Scotty. He constructs the changed person at the other end. It's not the original John that we've teleported.
So How Is Quantum Teleportation Different?
Well, quantum teleportation does not involve the direct measurement or observation of the properties of the particle that we wish to teleport, and so no decoherence or change takes place, at least not initially. Instead, the quantum properties of the particle to be teleported are linked in an entangling operation to the properties of another particle that is already entangled with a third, perhaps faraway, particle. A final operation (guided by the separate transmission of classical information describing how the first operation was performed) then causes the faraway particle, through its entanglement, to take on the properties of the first particle. The first particle is altered in the process (but not before its original properties are linked to those of the faraway particle). The properties of the original particle have thus effectively been teleported to the faraway location. (It is important to note that it is the properties of the original particle that have been teleported, to another already-existing, faraway particle. The original particle itself has not been teleported, has not been moved, but it has had its properties altered in the process.)
Such teleportation has already been demonstrated for the case where all of the particles are photons. We examine now this simple case. Then we'll move on to consider the teleportation of ions, atoms, and molecules—the stuff that humans are made of.
How the Quantum Teleportation of Photons Works
Note that photons can be used as binary qubits; that is, each can be made to exist in either a vertically or a horizontally polarized state or in a superposition of these two states. (For a definition and physical description of polarization, refer to Appendix A and Figure A.1(c).) We might assign a 1 to represent the vertical state and a 0 to represent the horizontal state. Then we're back in the realm of quantum computers and quantum information processing, all of which is formally described as a part of quantum information theory. But so is teleportation.
In one application of the theory, particles having binary states can be teleported through a well-defined process. We illustrate with our photons. In the first step, an entangled pair of photons (what is called an ebit) is created, and each still-entangled photon is then sent off to be shared: say, particle E1 to Alice and particle E2 to faraway Bob.
Sometime much later, in principle, maybe months later, Alice may have a photon that she wishes to teleport to Bob. Let's call it T. This photon is also a qubit polarized in either of two perpendicular directions or in some combination of these two polarized states.
Through the performance of what is called a Bell measurement, E1 becomes entangled with T. This measurement does not decohere or change the state of T, because there is no observation of its polarization; it asks only if the polarizations of the two photons are the same or not, and it entangles them in the process of finding out.
Alice does not know the polarization of T, but she does know the parameters of the Bell measurement that has been performed. She sends Bob a description of the measurement. She does this using classical methods, perhaps radio or Internet communications that take place at the speed of light (allowing for the operation of servers and other equipment, it might take a bit longer).
Bob uses the information that Alice sends to effectively reverse the effects of the Bell measurement and leave E2 in exactly the state that T was in originally. T is left at the end of the process in a different state than it was in originally, but because E2 has T's original properties, the original T has been effectively teleported from Alice to Bob.
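The protocol just described can be simulated on a classical computer for a single qubit. The sketch below is a minimal illustration using NumPy (the wire layout, helper functions, and variable names are my own, not drawn from any referenced experiment): Alice performs the Bell-measurement operations on T and E1, “sends” the two resulting classical bits, and Bob applies the corresponding corrections to E2.

```python
import numpy as np

rng = np.random.default_rng(1)

# Single-qubit gates
I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

def op(gate, wire):
    # Embed a 1-qubit gate into the 3-qubit space (wires: 0=T, 1=E1, 2=E2)
    mats = [gate if w == wire else I for w in range(3)]
    return np.kron(np.kron(mats[0], mats[1]), mats[2])

def cnot(control, target):
    # CNOT embedded in the 3-qubit space, built by permuting basis states
    U = np.zeros((8, 8), dtype=complex)
    for i in range(8):
        bits = [(i >> (2 - w)) & 1 for w in range(3)]
        if bits[control]:
            bits[target] ^= 1
        U[bits[0] * 4 + bits[1] * 2 + bits[2], i] = 1
    return U

# The unknown state T that Alice wants to teleport
a, b = 0.6, 0.8j
T = np.array([a, b], dtype=complex)

# Shared entangled pair (ebit): E1 to Alice, E2 to Bob
ebit = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
state = np.kron(T, ebit)

# Alice's Bell measurement: CNOT(T -> E1), Hadamard on T, then measure T and E1
state = cnot(0, 1) @ state
state = op(H, 0) @ state

# Simulate the random collapse of wires 0 and 1
probs = np.abs(state.reshape(4, 2)) ** 2
m = rng.choice(4, p=probs.sum(axis=1))   # m encodes Alice's two classical bits
bob = state.reshape(4, 2)[m]
bob = bob / np.linalg.norm(bob)          # Bob's qubit E2 after the collapse

# Alice sends the two classical bits; Bob applies the corrections
m0, m1 = m >> 1, m & 1
if m1: bob = X @ bob
if m0: bob = Z @ bob

print(np.allclose(bob, T))  # True: E2 now carries T's original state
```

Note that Bob's corrections use only the two classical bits Alice sends; nothing about T's amplitudes a and b ever travels over the classical channel.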
“But,” you might say, “why didn't Alice just send T directly to Bob? T is a photon, so it travels at the speed of light; in fact, it is light. Then Bob would have the original T at the speed of light without all of the intermediate finagling.”
Alice could, in fact, do that. But we have three purposes in describing photon teleportation. First, something like the teleporting of photons can be used to send encrypted messages in ways that are totally foolproof, as will be described next. Second, quantum computers could use teleportation to connect remotely located internal quantum components, or to communicate directly and securely computer to computer, bypassing the classical stages of supplying input and reading output (essentially eliminating human or classical devices as “middle men”). And, finally, particularly as the teleportation of photons has actually already been tested, we want to use this example with photons as a basis for considering the teleportation of objects with mass that can't travel at the speed of light.
Tests of Photon Teleportation
In 2004, a group under Anton Zeilinger successfully teleported photons 600 meters (about 650 yards) through optical fibers from one side of the Danube River to the other.10 Gribbin also describes two later significant tests, both of which occurred in 2012.11 In the first test, a large group of Chinese researchers managed to teleport “a quantum state” through open air over a distance of 97 kilometers (about 60 miles). In the second test, a team from four European countries teleported “the properties of a photon” some 85 miles at an altitude of nearly 8,000 feet (where the interfering air is thinner) between stations in the Canary Islands, on La Palma and Tenerife. If transmission is easier at altitude, why not then transmit first to satellites as relays and then back down? The Chinese envisage eventually creating a secure quantum network constructed in this manner, requiring multitudes of entangled photons. As of 2012, toward this goal, Chinese researchers had been able to produce entangled groups of four photons at the rate of several thousand per second.
Entanglement is the key to teleportation. But for the teleportation of bodies, we'd need to entangle particles of matter, not just photons. And we'll further require an entanglement of the complete set of wavefunctions describing all of the states of each atom or molecule, not just a binary pair of states. Some progress has been made.
The Entanglement of Ions and Teleportation of Electrons
In Computing with Quantum Cats, Gribbin describes work at the University of Michigan in which two ions about three feet apart are entangled with each other through the entanglement of their emitted photons.12 And an article in 2014 reports an achievement at the Delft University of Technology in the Netherlands of entanglement between single electrons trapped in supercooled diamonds placed thirty-three feet apart.13
These are major accomplishments, but we clearly have a long way to go to teleport even single atoms over very long distances. Ultimately, completely teleporting a human being would require that a staggering number of atoms be teleported; on the order of ten billion billion billion is a good approximation. We might conclude that the teleportation of humans à la Star Trek, although possible in principle, is exceedingly unlikely.
Remember, too, that unlike in Star Trek, we are only teleporting properties to atoms that already exist at a faraway destination. We aren't transporting the atoms themselves. And the atoms that stay behind have changed properties. The faraway atoms now resemble these earlier atoms before they had changed. (This, by the way, is what distinguishes teleportation from cloning. Had the earlier atoms retained their properties, then we would have two identical beings, one of them cloned from the other. But this can't happen in a quantum system. There is a “no-cloning” theorem, derived from quantum information theory, that shows that copying and cloning are not possible. The original state is always changed, and so one is always left with teleportation. Classical information, on the other hand, can be copied and transmitted freely. Interesting!)
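The no-cloning principle can be illustrated with a small numerical sketch (assuming NumPy; the candidate “copier” here, a CNOT gate with a blank qubit, and all names are my own). A CNOT faithfully copies the basis states 0 and 1, but applied to a superposition it produces an entangled pair rather than two independent copies.

```python
import numpy as np

zero = np.array([1, 0], dtype=complex)
one  = np.array([0, 1], dtype=complex)
plus = (zero + one) / np.sqrt(2)   # an equal superposition of 0 and 1

CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

def try_to_copy(psi):
    """Attempt to 'copy' psi onto a blank |0> qubit with a CNOT."""
    return CNOT @ np.kron(psi, zero)

# The basis states copy perfectly...
assert np.allclose(try_to_copy(zero), np.kron(zero, zero))
assert np.allclose(try_to_copy(one),  np.kron(one, one))

# ...but a superposition does not: the result is the entangled Bell state,
# not two independent copies kron(plus, plus)
result = try_to_copy(plus)
print(np.allclose(result, np.kron(plus, plus)))  # False
```

Because quantum mechanics is linear, any machine that copied both basis states would necessarily act this way on their superpositions, which is the heart of the no-cloning theorem.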
But here's food for thought: Orzel notes14 that some scientists (Roger Penrose, for example, in his book The Emperor's New Mind) suggest that consciousness is essentially a quantum phenomenon. If so, Orzel points out, teleportation might be used to transport the contents of our minds. (I would add: if that's the case, do we really need our bodies? What then of mortality?)
ABSOLUTELY SECURE QUANTUM ENCRYPTION
The very ability to copy is what makes classically transmitted information vulnerable to eavesdropping and surveillance, often without the sender and receiver even knowing about it. Even the public-key-encryption methods described earlier are susceptible in principle to decoding because the message sent can be copied. For example, a message can be copied into a bank of computers that operate in parallel toward breaking a code. Classical computers may lack sufficient power to do this with modern public-key encryptions, but quantum computers may soon succeed, as described earlier.
In contrast, quantum information simply cannot be copied by either classical or quantum means. To read or make a copy would physically involve an observation, a measurement. And, as we have learned, an essential feature of our quantum world is that any measurement or observation causes a decoherence of the wavefunction of an object, destroying much of the initial information that the wavefunction contained. What is measured is only a transformed part of the “original,” and the “original” exists no more.
This inability to copy quantum information is at the heart of an ability to encode information in ways that cannot be deciphered by anyone or anything other than the intended receiver, and further cannot even be tampered with without the sender and receiver knowing of it. The principal use of this ability is in what is called Quantum Key Distribution (QKD), described as follows.
Quantum Key Distribution—How It Works
Remember our first example of classical cryptography: Alice sent an encrypted message to Bob using a method called the one-time pad. Bob was able to decipher the message using a key, a string of 1s and 0s sent separately from Alice to Bob beforehand. As long as Bob has the key, he can decode even very long and complicated messages. But sending or distributing the key, especially to a number of designated receivers, can get expensive, and if the key were to fall into the wrong hands, unintended other persons could decipher the messages without the sender or the intended receivers ever knowing about it. One might say that securely sending the key is key.
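As a refresher, the one-time pad itself is simple to sketch (a minimal illustration; the sample message, function name, and key-generation choice are my own): each bit of the message is combined with the corresponding bit of the key, and applying the same key a second time undoes the encryption.

```python
import secrets

def one_time_pad(message: bytes, key: bytes) -> bytes:
    """XOR every message byte with the corresponding key byte.
    Applying the same function again with the same key decrypts."""
    assert len(key) >= len(message), "the key must be at least as long as the message"
    return bytes(m ^ k for m, k in zip(message, key))

message = b"MEET AT DAWN"
key = secrets.token_bytes(len(message))  # the random key Alice shares with Bob in advance

ciphertext = one_time_pad(message, key)
decrypted  = one_time_pad(ciphertext, key)  # Bob applies the same key
print(decrypted)  # b'MEET AT DAWN'
```

With a truly random key used only once, the ciphertext reveals nothing about the message; the entire security burden falls on distributing the key secretly, which is exactly the problem QKD addresses.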
Quantum Key Distribution (QKD) offers foolproof methods for sending the key. One QKD protocol, called BB84 (after Charles Bennett of IBM and Gilles Brassard of the University of Montreal, who together invented it in 1984), uses qubits to transmit a key in a coded way. Another QKD protocol, called E91 (after Artur Ekert, who suggested it in 1991), involves operations on a set of polarized entangled photons, in a manner somewhat similar to that described earlier for photon teleportation. For either protocol, any attempt to intercept the quantum information being sent disturbs it in a way that is detectable by both sender and receiver and makes the key unusable. There is no known method (even in principle) for cracking an encrypted communication made using a one-time quantum key established by either of these methods, and this will hold true even when there are more powerful classical or quantum computers.
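To give a feel for BB84, here is a toy classical simulation of its basis-choice and sifting steps (no real photons and no eavesdropper; the variable names, seed, and key length are my own). Alice sends each bit as a photon polarized in one of two randomly chosen bases; Bob measures in his own random basis, reading the bit correctly when the bases match and getting a random result when they don't; afterward the two publicly compare bases (never bits) and keep only the matching positions.

```python
import random

random.seed(0)
N = 32  # number of photons Alice sends

# Alice picks random bits and random bases ('+' rectilinear, 'x' diagonal)
alice_bits  = [random.randint(0, 1) for _ in range(N)]
alice_bases = [random.choice('+x') for _ in range(N)]

# Bob measures each photon in a randomly chosen basis.
# Same basis -> he reads Alice's bit; different basis -> a random result.
bob_bases = [random.choice('+x') for _ in range(N)]
bob_bits = [a_bit if a_basis == b_basis else random.randint(0, 1)
            for a_bit, a_basis, b_basis in zip(alice_bits, alice_bases, bob_bases)]

# Sifting: over a public classical channel they compare bases (never bits)
# and keep only the positions where the bases matched.
key     = [a for a, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]
bob_key = [b for b, ab, bb in zip(bob_bits, alice_bases, bob_bases) if ab == bb]

print(key == bob_key)  # True: with no eavesdropper the sifted keys agree
```

In the real protocol, Alice and Bob also publicly compare a sample of the sifted bits: any mismatches reveal that an eavesdropper has disturbed the photons, and the key is discarded.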
Quantum Key Transmission
Secure BB84 keys have been exchanged through optical fiber lengths of 12 miles at rates of a million bits per second, and of 60 miles at rates of ten thousand bits per second. (Typical optical fiber communications are 1,000 to 10,000 times faster. But remember, it is only the key that needs to be transmitted in this secure quantum manner; the rest of the message can be encoded using the key and sent at classical speeds.) The longest quantum key transmission has been achieved with optical fibers through a length of about 100 miles. And the same European group that achieved a teleportation of some 85 miles through high-altitude air (between stations at two of the Canary Islands) has similarly managed secure QKD transmission through air over the same distance using both the BB84 and E91 protocols. Since the air is thinner yet at the altitude of satellites, this suggests the possibility of secure longer-distance transmission using satellites as relays.
Quantum Key Distribution Networks
These have been set up in the United States by the Defense Advanced Research Projects Agency (DARPA), in Vienna, in Switzerland, and in Tokyo.
Commercial Quantum Key Distribution
As of December 2015, at least four companies are listed as manufacturing quantum cryptography systems: MagiQ Technologies, Inc., of Boston; ID Quantique of Geneva; QuintessenceLabs of Canberra; and SeQureNet of Paris.15 As practical quantum computers that are capable of factoring large products of prime numbers become available (so that classical public-key encryption is no longer safe), these companies should experience an increased demand for their quantum key-encryption services.
CHAPTER 8 SUMMARY
The takeaway points from this chapter are: (1) that quantum computers offer the possibility of computation for some applications that can be millions of times faster than classical computers; (2) that teleportation is not possible by classical means; (3) that teleportation is possible in principle using quantum methods, but, as a practical matter, for much more than single particles it becomes extremely difficult, and for humans highly, highly, highly unlikely; (4) that the teleportation that would result is a teleportation of properties, not particles; (5) that quantum computers threaten to break some of the best of our codes; but (6) quantum encryption offers codes that cannot be broken, and cannot even be tampered with (without the sender and receiver knowing that someone has interfered).
This chapter involved information theory as applied to encryption and teleportation using photons and matter on a quantum atomic scale. In Part Three, Chapter 9, we'll see how the potential loss of information in the evaporation of black holes threatened the theory of quantum mechanics. This is just one small part of our consideration in that chapter of the quantum particles of nature and the quantum's role in the expansion of our universe, from the big bang to the galaxies.