The number 15 can be obtained by multiplying the number 3 by the number 5; the factors of 15 are 3 and 5. This may not seem a very profound piece of knowledge. But during the early years of the twenty-first century the development of a new kind of computer that could work this out for itself (by ‘factorisation’ of the number 15) marked a giant leap towards practical quantum computing. A genuine quantum computer would be as far in advance of a conventional computer as a conventional computer is in advance of an abacus. But such machines may be no more than twenty years away.
A conventional computer operates by manipulating numbers in binary code, represented as strings of 0s and 1s, which can be thought of as switches that are either on or off. Each switch corresponds to a ‘bit’ of information, and eight-bit ‘words’ are known as bytes. One measure of the power of a computer is the number of bits (or number of bytes) that can be stored and manipulated during calculations. But when we are dealing with quantum entities, such as individual atoms or electrons, the rules are different. A quantum ‘switch’ can be both on and off at the same time, in a so-called superposition of states.
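The classical half of that picture is easy to demonstrate. A minimal sketch in Python (the particular byte chosen here is just an example):

```python
# A conventional bit is definitely 0 or 1; a group of eight bits is a byte.
byte = 0b1011_0010           # one byte, written as a string of 0s and 1s
print(byte)                  # the same value in decimal: 178
print(bin(byte))             # back to binary: 0b10110010
print(byte.bit_length())     # it fits in 8 bits
```

Each of those eight switches is unambiguously on or off; the quantum case described next is what breaks this picture.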
An electron provides an example of such a phenomenon. Electrons have a property called spin, which can be thought of as like an arrow pointing up or down. ‘Up’ might correspond to 0 in binary language, and ‘down’ would then correspond to 1. But in the right circumstances an electron can exist in both states simultaneously (as can some atoms).
The result is known as a quantum bit, or qubit (pronounced like the ancient unit ‘cubit’). And in computational terms, a string of qubits has the power of a conventional computer whose number of bits is 2 raised to the power of the number of qubits. So a computer built from 4 qubits has the power of a conventional computer with 2×2×2×2 = 16 bits. Such exponentials grow very quickly; a quantum computer containing just 10 qubits would have the power of a conventional computer with 1,024 bits (known as a kilobit, because it is very nearly 1,000 bits).
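That arithmetic can be checked directly; a quick sketch in Python:

```python
# n qubits can hold 2**n classical bit-patterns in superposition,
# which is the sense in which they match a 2**n-bit conventional machine.
for n_qubits in (4, 10):
    print(n_qubits, "qubits ->", 2 ** n_qubits, "bits")
# 4 qubits -> 16 bits
# 10 qubits -> 1024 bits
```

By the same reasoning, 50 qubits would already correspond to more than a thousand million million bits.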
Unfortunately, it is very difficult to maintain strings of qubits, manipulate them, and read out data from them. But a start has been made, and the factorisation problem was solved as a result.
Factorisation was chosen as one of the first problems to be tackled because it is of vital importance to the security codes used by banks, big business, and the military, and to keep information safe on the internet. Codes used in these areas are based on the properties of very large numbers, hundreds of digits long, made by multiplying together two large prime numbers (that is, numbers which cannot themselves be factorised, like 7 or 317, but much bigger). The very large number is used to scramble the message being coded, which can then be unscrambled only by someone who knows its two prime factors. The security comes from the fact that it is very difficult to find the factors of a very large number. But what might take years on a conventional computer could be done in minutes on a full-sized quantum computer. The technique for doing this was developed by Peter Shor, of Bell Labs, and is known as Shor’s algorithm.
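The asymmetry behind such codes can be illustrated with small numbers. The toy trial-division routine below stands in for the hard classical search; on the hundreds-of-digits numbers used in real codes, this brute-force approach would take longer than the age of the universe:

```python
def factorise(n):
    """Find a factor pair of n by trial division - easy for small n,
    hopeless for the enormous numbers used in real security codes."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    return n, 1  # no divisor found: n is prime

# Making the code number is easy: multiply two primes together.
p, q = 7, 317
n = p * q
print(n)              # 2219

# Undoing it classically means searching for the factors one by one.
print(factorise(n))   # (7, 317)
print(factorise(15))  # (3, 5) - the case the early quantum computers solved
```

Multiplication scales gently as the numbers grow; the search for factors does not, and that gap is what the codes rely on.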
In 2001, a team of IBM researchers managed to manipulate a system of molecules, each made up of five fluorine atoms and two carbon atoms, to act like a 7-qubit computer, equivalent to a conventional computer with 2⁷ = 128 bits. They used this ‘computer’ to test Shor’s algorithm by working out the factors of 15. You will not be surprised to learn that they got the answer 3 and 5.
Unfortunately, this particular technique cannot be scaled up to large numbers of qubits; but it provided clear proof that Shor’s algorithm works, and that a scaled-up quantum computer would be a powerful machine which would not only force an overhaul of security systems but could also tackle many other problems.
A key breakthrough came in 2012, when a team at the Santa Barbara campus of the University of California (UCSB) ran a version of Shor’s algorithm on a solid-state processor containing four superconducting phase qubits, the quantum equivalents of transistors. Like their predecessors, they found the factors of the number 15, the smallest number to which Shor’s technique can be applied. But unlike the IBM work, this really was quantum computing on a chip, with the potential to be scaled up to something much larger. It won’t be easy, but as Andrew Cleland, one of the UCSB researchers, said at the time, ‘the path forward is clear.’