Quantum Computation
By David Deutsch and Artur Ekert
From the March 1998 issue of Physics World

A new way of harnessing nature
Many milestones in the history of technology have involved the
discovery of new ways of harnessing nature --- exploiting various physical
resources such as materials, forces, and sources of energy. In the
twentieth century information was added to this list when the invention of
computers allowed complex information processing to be performed outside
human brains. The history of computer technology has itself involved a
sequence of changes from one type of physical realisation to another ---
from gears to relays to valves to transistors, integrated circuits and so
on. Today’s advanced lithographic techniques can etch logic gates and
wires less than a micron across onto the surfaces of silicon chips. Soon
they will yield even smaller components, until we reach the point where
logic gates are so small that they consist of only a few atoms each.
On the scale of human perception and above, classical (non-quantum)
laws of physics are good phenomenological approximations, but on the
atomic scale the laws of quantum mechanics become dominant, and they have
quite a different character. If computers are to continue to become faster
(and therefore smaller), new, quantum technology must replace or
supplement what we have now, but it turns out that such technology can
offer much more than smaller and faster microprocessors. It can support
entirely new modes of computation, with new quantum algorithms that do not
have classical analogues. And more: the quantum theory of computation
plays an even more fundamental role in the scheme of things than its
classical predecessor did, so that anyone who seeks a fundamental
understanding of either physics or information processing must incorporate
its new insights into their world view.
From bits to qubits
What makes quantum computers so different from their classical
counterparts? Let us take a closer look at the basic unit of information:
the bit. From a physical point of view a bit is a two-state system: it can
be prepared in one of two distinguishable states representing two logical
values --- no or yes, false or true, or simply 0 or 1. For example, in
digital computers, the voltage between the plates of a capacitor can
represent a bit of information: a charge on the capacitor denotes 1 and
the absence of charge denotes 0. One bit of information can be also
encoded using, for instance, two different polarisations of light or two
different electronic states of an atom. Now, quantum mechanics tells us
that if a bit can exist in either of two distinguishable states, it can
also exist in coherent superpositions of them. These are further states,
which in general have no classical analogues, in which the atom represents
both values, 0 and 1, simultaneously. To get used to the idea that a
physical quantity can have two values at once, it is helpful to consider
the experiment in Fig. 1a.
Fig. 1a. A half-silvered mirror reflects half the light that impinges upon it. But a single photon doesn't split: when we send a photon at such a mirror it is detected, with equal probability, either at detector A or B. This does not, however, mean that the photon leaves the mirror in either the horizontal (H) or the vertical (V) direction at random. In fact the photon takes both paths at once! This can be demonstrated with the help of a slightly more complicated experiment shown in Fig. 1b.
A half-silvered mirror is one that reflects half the light that
impinges upon it, while allowing the remaining half to pass through
unaffected. Let us aim a single photon at such a mirror, as in
Fig.1a. What happens? One thing we know is that the photon doesn’t split
in two: we can place photodetectors wherever we like in the apparatus,
fire in a photon, and verify that if any of the photodetectors registers a
hit, none of the others do. In particular, if we place a photodetector
behind the mirror in each of the two possible exit beams, the photon is
detected with equal probability at either detector. So does the photon
leave the first mirror in one of the two possible directions, at random?
It does not! It may seem obvious that at the very least, the photon is
either in the transmitted beam H or in the reflected beam V
during any one run of this experiment. But that is not so either. In fact
the photon takes both paths at once, as can be demonstrated with the help
of the apparatus shown in Fig. 1b. Two normal mirrors are placed so
that both paths intersect at a second half-silvered mirror. With this
setup we can observe the astonishing, purely quantum phenomenon of
single-particle interference.
Fig. 1b. Single-particle interference. A photon which enters the interferometer always strikes detector A and never detector B. Any explanation which assumes that the photon takes exactly one path through the interferometer --- either H or V --- leads to the conclusion that detectors A and B should on average each fire on half the occasions when the experiment is performed. But experiment shows otherwise.
Suppose that a particular photon followed
the horizontal path marked H in Fig. 1b after striking the mirror. Then
(by comparison with Fig.1a) we should find that the two detectors
registered hits with equal probability. Exactly the same would be observed
if the photon were on the vertical path V. Hence if it were really the
case that the photon takes exactly one path through the apparatus --- no
matter which one --- detectors A and B would on average each fire on half
the occasions when the experiment is performed. However, that is not what
happens. It turns out that in the arrangement shown, the photon always
strikes detector A and never detector B.
The inescapable conclusion is that the photon must, in some sense, have
travelled both routes at once --- for if either of the two paths is
blocked by an absorbing screen, it immediately becomes equally probable
that A or B is struck. In other words, blocking off either of the paths
illuminates B; with both paths open, the photon somehow receives
information that prevents it from reaching B, information that travels
along the other path at the speed of light, bouncing off the mirror,
exactly as a photon would. This property of quantum interference --- that
there appear to be invisible counterparts affecting the motion of
particles that we detect --- applies not only to photons but to all
particles and all physical systems. Thus quantum theory describes an
enormously larger reality than the universe we observe around us. It turns
out that this reality has the approximate structure of multiple variants
of that universe, co-existing and affecting each other only through
interference phenomena --- but for the purposes of this article, all we
need of this ‘parallel universes’ ontology is the fact that what we see as
a single particle is actually only one tiny aspect of a tremendously
complex entity, the rest of which we cannot detect directly. Quantum
computation is all about making the invisible aspects of the particle ---
its counterparts in other universes --- work for us.
Fig. 2. A sliver of glass inserted into one of the two paths in the interferometer can redirect photons from one detector to another. All photons that enter the top interferometer strike detector A. In the bottom interferometer, the interference is modified by the presence of the sliver of glass on the vertical path, and as a result all photons end up in detector B. Thus something that has happened on only one of the paths has, with certainty, changed the final outcome of the experiment. This effect is especially useful in quantum computation.
One effect that is especially useful in quantum computation can
be demonstrated if we delay the photon on one of the paths H or V. This
can be done by inserting a sliver of glass into that path, as illustrated
in Fig.2. Since the interference between the photon and its invisible
counterpart depends on their exact arrival times, we can, for instance,
choose the thickness of the glass, and hence the delay time, in such a way
that the photon will certainly (i.e. in all universes) emerge at detector
B instead of detector A. Thus something that has happened on only one of
the paths (and hence in only one of the universes) has affected both of
them. We shall return to this point below.
Just as the photon can be in a coherent superposition of being on the
path H and on the path V, any quantum bit, or qubit, can be
prepared in a superposition of its two logical states 0 and 1. That is the
sense in which a qubit can store both 0 and 1 simultaneously, in arbitrary
proportions. But note that just as the photon, if measured, will be
detected on only one of the two paths, so if the qubit is measured, only
one of the two numbers it holds will be detected, at random: not a very
useful property in itself.
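In terms of amplitudes, a qubit is just a pair of complex numbers whose squared magnitudes give the probabilities of reading 0 or 1. A minimal sketch, with an arbitrarily chosen 70/30 split:

```python
import numpy as np

alpha, beta = np.sqrt(0.7), np.sqrt(0.3)    # amplitudes for the values 0 and 1
qubit = np.array([alpha, beta])             # "both values at once", in these proportions

# Measuring forces a single answer, 0 or 1, with probabilities |alpha|^2 and |beta|^2.
rng = np.random.default_rng(0)
samples = rng.choice([0, 1], size=10_000, p=np.abs(qubit)**2)
print(samples.mean())                       # about 0.3: the value 1 appears ~30% of the time
```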
But now let us push the idea of superpositions of numbers a little
further. Consider a register composed of three physical bits. A classical
3-bit register can store exactly one of eight different numbers, i.e. the
register can be in one of the eight possible configurations 000, 001, 010,
..., 111, representing the numbers 0 to 7. But a quantum register composed
of three qubits can simultaneously store up to eight numbers in a quantum
superposition. It is quite remarkable that eight different numbers can be
physically present in the same register; but it should be no more
surprising than the numbers 0 and 1 both being present in the same qubit.
If we add more qubits to the register its capacity for storing quantum
information increases exponentially: four qubits can store 16 different
numbers at once, and in general L qubits can store up to 2^L
numbers at once. A 250-qubit register --- essentially made of 250 atoms,
say --- would be capable of holding more numbers simultaneously than there
are atoms in the known universe. (If anything, this understates the
amount of quantum information that they hold, for in general, the elements
of a superposition are present in continuously variable proportions, each
with its own phase angle as well.) Even so, if we measure the register’s
contents, we will see only one of those numbers. However, now we can start
doing some non-trivial quantum computation, for once the register is
prepared in a superposition of many different numbers, we can perform
mathematical operations on all of them at once. For example, if the qubits
are atoms then suitably tuned laser pulses affect their electronic states
and evolve initial superpositions of encoded numbers into different
superpositions. During such an evolution each number in the superposition
is affected, so we are performing a massively parallel computation. Thus a
quantum computer can in a single computational step perform the same
mathematical operation on, say, 2^L different input numbers, and
the result will be a superposition of all the corresponding outputs. In
order to accomplish the same task any classical computer has to repeat the
computation 2^L times, or has to use 2^L different
processors working in parallel. In this way a quantum computer offers an
enormous gain in the use of computational resources such as time and
memory --- though only in certain types of computation.
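A classical simulation of this bookkeeping makes the point concrete (and also hints at why such simulations are expensive): an L-qubit register is described by 2^L amplitudes, and one computational step is a single reversible operation applied to that whole list. In the Python sketch below the map x -> 3x + 1 (mod 2^L) is an arbitrary example of such an operation.

```python
import numpy as np

L = 3
N = 2**L
register = np.ones(N) / np.sqrt(N)            # equal superposition of 0, 1, ..., 7

U = np.zeros((N, N))
for x in range(N):
    U[(3 * x + 1) % N, x] = 1                 # a permutation, hence reversible (unitary)

register = U @ register                       # one operation acts on every stored number at once
print(np.abs(register)**2)                    # still spread evenly over all eight values
```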
Quantum algorithms
What types? As we have said, ordinary information storage is not one of
them, for although the computer now holds all the outcomes of
2^L computations, the laws of physics only allow us to see one
of them. However, just as the single answer ‘A’ in the experiment of Fig.2
depends on information that travelled along each of two paths, quantum
interference now allows us to obtain a single, final result that depends
logically on all 2^L of the intermediate results. This is how a
remarkable quantum algorithm recently discovered by Lov Grover of
AT&T’s Bell Laboratories in New Jersey achieves the mind-boggling feat
of searching an unsorted list of N items in only about √N steps
[1]. Consider, for example, searching for a specific telephone number in a
directory containing a million entries, stored in the computer’s memory in
alphabetical order of names. It is easily proved (and obvious) that no
classical algorithm can improve on the brute-force method of simply
scanning the entries one by one until the given number is found, which
will, on average, require 500,000 memory accesses. A quantum computer can
examine all the entries simultaneously, in the time of a single access.
However, if it is merely programmed to print out the result at that point,
there is no improvement over the classical algorithm: only one of the
million computational paths (i.e. one in a million universes) would have
checked the entry we are looking for, so there would be a probability of
only one in a million that we would obtain that information if we measured
the computer’s state. But if we leave that quantum information in the
computer, unmeasured, a further quantum operation can cause that
information to affect other paths, just as in the simple interference
experiment described above. In this way the information about the desired
entry is spread, through quantum interference, to more universes. It turns
out that if this interference-generating operation is repeated about 1000
times (in general, about √N times), the information about which
entry contains the desired number will be accessible to measurement with
probability 0.5 --- i.e. it will have spread to more than half the
universes. Therefore repeating the entire algorithm a few more times will
find the desired entry with a probability overwhelmingly close to 1. In
addition to finding the entry with a given property, variations on
Grover’s algorithm can also find the largest or smallest value in a list,
or the modal value, and so on, so it is a very versatile searching tool.
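The amplitude bookkeeping behind Grover's algorithm is simple enough to simulate classically for toy sizes. The Python sketch below uses an arbitrary list of 1024 entries and an arbitrary marked position; after roughly √N repetitions of the sign-flip and inversion-about-the-mean step, nearly all of the probability has accumulated on the marked entry.

```python
import numpy as np

N = 2**10                                   # 1024 "entries"
marked = 711                                # the entry we are searching for
state = np.ones(N) / np.sqrt(N)             # start in an equal superposition

iterations = int(np.pi / 4 * np.sqrt(N))    # about 25, i.e. of order sqrt(N) steps
for _ in range(iterations):
    state[marked] *= -1                     # "oracle" step: flip the sign of the marked entry
    state = 2 * state.mean() - state        # inversion about the mean (diffusion step)

print(iterations, np.abs(state[marked])**2) # ~25 steps; probability of reading 711 is close to 1
```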
However, in practice, searching a physical database is unlikely to become
a major application of Grover’s algorithm --- at least so long as
classical memory remains cheaper than quantum memory. For since the
operation of transferring a database from classical to quantum memory
(bits to qubits) would itself require O(N) steps, Grover’s algorithm would
improve search times by at best a constant factor, which could also be
achieved by classical parallel processing. Where Grover’s algorithm would
really come into its own is in algorithmic searches --- that is, searches
of lists that are not stored in memory but are themselves generated on the
fly by a computer program.
For instance, a chess-playing quantum computer could use it to
investigate a trillion possible continuations from a given position in
roughly the number of steps that a classical computer (using blind
‘brute-force’ searching) would need to investigate a mere million. Despite
the greater scope for ‘tree-pruning’ in classical chess-playing
algorithms, this is likely to provide a very significant improvement. As
Gilles Brassard of the Université de Montréal has recently pointed out,
another important application of Grover’s algorithm will be in
cryptanalysis, to attack classical cryptographic schemes such as DES (the
Data Encryption Standard)[2]. Cracking DES essentially requires a search
among 2^56 ≈ 7 × 10^16 possible keys. If these can be
checked at a rate of, say, one million keys per second, a classical
computer would need over a thousand years to discover the correct key
while a quantum computer using Grover’s algorithm would do it in less than
four minutes. By some strange coincidence, several of the superior
features of quantum computers have applications in cryptography. One of
them is Grover’s algorithm. Another is the quantum algorithm discovered in
1994 by Peter Shor, also of AT&T’s Bell Laboratories in New Jersey,
for factorising large integers efficiently [3]. Here the difference in
performance between the quantum and classical algorithms is even more
spectacular.
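Returning to the DES estimate for a moment: the figures quoted above can be checked with a little arithmetic, assuming the usual (π/4)√N iteration count for Grover's algorithm and a rate of one million key tests per second in both cases.

```python
import math

N = 2**56                                   # possible DES keys (~7 x 10^16)
rate = 1e6                                  # keys checked per second

classical_years = (N / 2) / rate / 3.15e7   # expect to try about half the keys
grover_minutes = (math.pi / 4) * math.sqrt(N) / rate / 60

print(f"{classical_years:.0f} years")       # ~1100 years: "over a thousand years"
print(f"{grover_minutes:.1f} minutes")      # ~3.5 minutes: "less than four minutes"
```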
Mathematicians believe (firmly, though they have not actually proved
it) that in order to factorise a number with N decimal digits, any
classical computer needs a number of steps that grows exponentially with
N: that is to say, adding one extra digit to the number to be factorised
generally multiplies the time required by a fixed factor. Thus, as we
increase the number of digits, the task rapidly becomes intractable. The
largest number that has been factorised as a mathematical challenge, i.e.
a number whose factors were secretly chosen by mathematicians in order to
present a challenge to other mathematicians, had 129 digits. No one can
even conceive of how one might factorise, say, thousand-digit numbers by
classical means; the computation would take many times as long as the
estimated age of the universe. In contrast, quantum computers could factor
thousand-digit numbers in a fraction of a second --- and the execution
time would grow only as the cube of the number of digits.
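The "fixed factor per extra digit" behaviour is easiest to see for the most naive classical method, trial division, which must test of order 10^(d/2) candidate divisors for a d-digit number; the short tabulation below simply illustrates that growth. (The best known classical algorithms are far cleverer, but their running time still grows faster than any polynomial in d, whereas Shor's algorithm scales roughly as d^3.)

```python
# Work needed by naive trial division as the number of digits d grows.
for d in range(10, 60, 10):
    trial_divisions = 10**(d / 2)            # candidate divisors for a d-digit number
    print(d, f"{trial_divisions:.0e}")       # each extra digit multiplies the work by ~3.16
```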
Now, the intractability of factorisation underpins the security of what
are currently the most trusted methods of encryption, in particular of the
RSA (Rivest, Shamir and Adleman) system, which is often used to protect
electronic bank accounts (Public key cryptography was originally
discovered by Ellis and Cocks of the British Government Communications
Headquarters, GCHQ). Once a quantum factorisation engine (a special-purpose
quantum computer for factorising large numbers) is built, all such
cryptographic systems will become insecure. Indeed, in one sense they are
already insecure: any RSA-encrypted message that is recorded today will
become readable moments after the first quantum factorisation engine is
switched on, and therefore RSA cannot be used for securely transmitting
any information that will still need to be secret on that happy day.
Admittedly, that day is probably decades away, but can anyone
prove, or give any reliable assurance, that it is? Confidence in
the slowness of technological progress is all that the security of the RSA
system now rests on. What quantum computation takes away with one hand, it
returns, at least partially, with the other. One of the simplest types of
quantum computation --- a type which is now routinely carried out in the
laboratory and may soon be a commercial proposition --- is quantum
cryptography [4]. The reason why it is already feasible is that it
requires only one qubit and one quantum computational step at a time. It
provides perfect security because unlike all classical cryptography
(except the one-time pad) it relies on the laws of physics rather than on
ensuring that successful eavesdropping would require excessive
computational effort. Note, however, that existing systems have limited
range.
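To see why a factorisation engine breaks RSA, consider a toy example with textbook-sized primes (and none of the padding a real system would use); the primes and message below are arbitrary choices. The private key is computed from the factors of the public modulus, so anyone who can factor n can decrypt.

```python
p, q = 61, 53
n, e = p * q, 17                            # the public key (n, e)

ciphertext = pow(65, e, n)                  # someone encrypts the message 65 for us

# An eavesdropper who factors n = 61 * 53 can redo the private-key computation:
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)                         # private exponent (Python 3.8+ modular inverse)
print(pow(ciphertext, d, n))                # -> 65: the "secret" message is recovered
```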
The potential power of quantum phenomena to perform computations was
first adumbrated in a talk given by Richard Feynman at the First
Conference on the Physics of Computation held at MIT in 1981 [5]. He
observed that it appeared to be impossible in general to simulate the
evolution of a quantum system on a classical computer in an efficient way.
The computer simulation of quantum evolution typically involves an
exponential slowdown in time, compared with the natural evolution,
essentially because the amount of classical information required to
describe the evolving quantum state is exponentially larger than that
required to describe the corresponding classical system with a similar
accuracy. (To predict interference effects, one has to describe all the
system’s exponentially many counterparts in parallel universes.) However,
instead of viewing this intractability as an obstacle, Feynman regarded it
as an opportunity. He pointed out that if it requires that much
computation to work out what will happen in a multi-particle interference
experiment, then the very act of setting up such an experiment and
measuring the outcome is equivalent to performing a complex
computation.
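Feynman's observation can be quantified very simply: merely storing the state of n interacting qubits requires 2^n complex amplitudes, as the short tabulation below illustrates (the choice of 16 bytes per amplitude is just the usual double-precision complex number).

```python
# Memory needed to hold the full state vector of n qubits.
for n in (10, 30, 50, 300):
    amplitudes = 2**n
    gigabytes = amplitudes * 16 / 1e9        # 16 bytes per complex amplitude
    print(n, f"{gigabytes:.3g} GB")          # 50 qubits already needs ~18 million GB
```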
Quantum computation has already been used, in simple cases, to predict
the behaviour of quantum systems. At some point in the foreseeable future,
quantum computers will take on a new and irreplaceable role in the structure of
science, for the ability of science to make predictions will then depend
on quantum computation. The foundations of the quantum theory of
computation (which must now be regarded as the theory of computation ---
Turing’s classical theory being only an approximation) were laid down in
1985 when David Deutsch of the University of Oxford published a crucial
theoretical paper in which he described a universal quantum computer [6].
Since then, the hunt has been on for interesting things for quantum
computers to do, and at the same time, for the scientific and
technological advances that could allow us to build quantum computers.
Building quantum computers
In principle we know how to build a quantum computer; we start with
simple quantum logic gates and connect them up into quantum networks
[7].
A quantum logic gate, like a classical gate, is a very simple computing
device that performs one elementary quantum operation, usually on two
qubits, in a given time. Of course, quantum logic gates differ from their
classical counterparts in that they can create, and perform operations on,
quantum superpositions.
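As an illustration (not a description of any particular hardware), here is the standard matrix arithmetic for one two-qubit step: a Hadamard gate puts the first qubit into a superposition, and a controlled-NOT gate then acts on that superposition as a whole, producing an entangled state.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)      # one-qubit Hadamard gate
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])                   # flips qubit 2 whenever qubit 1 is 1

state = np.zeros(4)
state[0] = 1                                      # start in |00>; vector order is |00>,|01>,|10>,|11>
state = CNOT @ np.kron(H, I) @ state              # create a superposition, then entangle
print(state)                                      # ~0.707*|00> + 0.707*|11>
```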
However as the number of quantum gates in a network increases, we
quickly run into some serious practical problems. The more interacting
qubits are involved, the harder it tends to be to engineer the interaction
that would display the quantum interference. Apart from the technical
difficulties of working at single-atom and single-photon scales, one of
the most important problems is that of preventing the surrounding
environment from being affected by the interactions that generate quantum
superpositions. The more components there are, the more likely it is that
quantum information will spread outside the quantum computer and be lost
into the environment, thus spoiling the computation. This process is
called decoherence. Thus our task is to engineer sub-microscopic
systems in which qubits affect each other but not the environment.
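A crude way to picture decoherence is to watch the off-diagonal elements of a qubit's density matrix --- the terms responsible for interference --- decay as information leaks into the environment; in the sketch below the decay rate per step is an arbitrary illustrative number.

```python
import numpy as np

psi = np.array([1, 1]) / np.sqrt(2)          # a qubit storing 0 and 1 at once
rho = np.outer(psi, psi.conj())              # its density matrix

for step in range(4):
    print(step, np.round(rho, 3))            # off-diagonal terms shrink towards a 50/50 mixture
    rho[0, 1] *= 0.5                         # each step halves the coherences
    rho[1, 0] *= 0.5
```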
Some physicists are pessimistic about the prospects of substantial
further progress in quantum computer technology. They believe that
decoherence will in practice never be reduced to the point where more than
a few consecutive quantum computational steps can be performed. (This,
incidentally, would already allow for some very useful devices --- see the
table below.) Other, more optimistic researchers believe that practical
quantum computers will appear in a matter of years rather than decades. We
tend towards the optimistic end of the scale, partly because theory tells
us that there is now no fundamental obstacle in the way, partly
because of the astonishing talents and problem-solving abilities of the
experimental physicists now working on this project, and partly because
optimism makes things happen.
However, the problems will not be solved in one fell swoop. The current
challenge is not to build a fully-fledged universal quantum computer right
away, but rather to move from the experiments in which we merely observe
quantum phenomena to experiments in which we can control those phenomena
in the necessary ways. Simple quantum logic gates involving two qubits are
being realised in laboratories in Europe and the U.S.A. The next decade should
bring control over several qubits and, without any doubt, we shall already
begin to benefit from our new way of harnessing nature. It is known, for
instance, that simple quantum networks can offer better frequency
standards [8]. Some possible milestones in the development of quantum
computer technology are shown in the table below.
Milestones in the development of quantum computer technology

Type of hardware | Number of qubits needed | Number of steps before decoherence | Status
Quantum cryptography | 1 | 1 | Already implemented
Entanglement-based quantum cryptography | 2 | 1 | Demonstrated in lab
Quantum controlled-NOT gate | 2 | 1 | Demonstrated in lab
Composition of quantum gates | 2 | 2 | Demonstrated in lab
Deutsch's algorithm | 2 | 3 | Demonstrated in NMR quantum computer
Channel capacity doubling | 2 | 2 | Imminent
Teleportation | 3 | 2 | Achieved intermittently
Entanglement swapping | 4 | 1 | Achieved intermittently
Repeating station for quantum cryptography | A few | A few | Theory not yet complete
Quantum simulations | A few | A few | Simple cases demonstrated
Grover's algorithm with toy data | 3+ | 6+ | Foreseeable
Ultra-precise frequency standards | A few | A few | Foreseeable
Entanglement purification | A few | A few |
Shor's algorithm with toy data | 16+ | Hundreds+ |
Quantum factoring engine | Hundreds | Hundreds |
Universal quantum computer | Thousands | Thousands+ |
(the table was compiled in March 1998)
Deeper implications
When the physics of computation was first investigated systematically
in the 1970s, the main fear was that quantum-mechanical effects might
place fundamental bounds on the accuracy with which physical objects could
realise the properties of bits, logic gates, the composition of
operations, and so on, which appear in the abstract and
mathematically sophisticated theory of computation. Thus it was feared that
the power and elegance of that theory, its deep concepts such as
computational universality, its deep results such as Turing’s halting
theorem, and the more modern theory of complexity, might all be mere
figments of pure mathematics, not really relevant to anything in
nature.
Those fears have not only been proved groundless by the research we
have been describing, but also, in each case, the underlying aspiration
has been wonderfully vindicated to an extent that no one even dreamed of
just twenty years ago. As we have explained, quantum mechanics, far from
placing limits on what classical computations can be performed in nature,
permits them all, and in addition provides whole new modes of computation,
including algorithms that perform tasks (such as perfectly secure
public-key cryptography) that no classical computer can perform at all. As
far as the elegance of the theory goes, researchers in the field have now
become accustomed to the fact that the real theory of computation hangs
together better, and fits in far more naturally with fundamental theories
in other fields, than its classical approximation could ever have been
expected to. Even at the simplest level, the very word ‘quantum’ means the
same as the word ‘bit’ --- an elementary chunk --- and this reflects the
fact that fully classical physical systems, being subject to the generic
instability known as ‘chaos’, would not support digital computation at all
(so even Turing machines, the theoretical prototype of all classical
computers, were secretly quantum-mechanical all along!). The Church-Turing
hypothesis in the classical theory (that all ‘natural’ models of
computation are essentially equivalent to each other) was never proved.
Its analogue in the quantum theory of computation (the Turing Principle,
that the universal quantum computer can simulate the behaviour of any
finite physical system) was straightforwardly proved in Deutsch’s 1985
paper. A stronger result (also conjectured but never proved in the
classical case), namely that such simulations can always be performed in a
time that is at most a polynomial function of the time taken for the
physical evolution, has since been proved in the quantum case.
Among the many ramifications of quantum computation for apparently
distant fields of study are its implications for both the philosophy and
the practice of mathematical proof. Performing any computation that
provides a definite output is tantamount to proving that the
observed output is one of the possible results of the given computation.
Since we can describe the computer’s operations mathematically, we can
always translate such a proof into the proof of some mathematical theorem.
This was the case classically too, but in the absence of interference
effects it is always possible to note down the steps of the computation,
and thereby produce a proof that satisfies the classical definition: ‘a
sequence of propositions each of which is either an axiom or follows from
earlier propositions in the sequence by the standard rules of inference'.
Now we must leave that definition behind. Henceforward, a proof must be
regarded as a process --- the computation itself, not a record of all its
steps --- for we must accept that in future, quantum computers will prove
theorems by methods that neither a human brain nor any other arbiter will
ever be able to check step-by-step, since if the ‘sequence of
propositions’ corresponding to such a proof were printed out, the paper
would fill the observable universe many times over.
Concluding remarks
Experimental and theoretical research in quantum computation is now
attracting increasing attention from both academic researchers and
industry worldwide. The idea that nature can be controlled and manipulated
at the quantum level is a powerful stimulus to the imagination of
physicists and engineers. There is almost daily progress in developing
ever more promising technologies for realising quantum computation and new
quantum algorithms with various advantages over their classical
counterparts. There is potential here for truly revolutionary
innovation.
Further reading
The newly established Centre for
Quantum Computation at the University of Oxford has several WWW pages
and links devoted to quantum computation and cryptography. Some of the
deeper implications of quantum computing are discussed at length in The
Fabric of Reality by David Deutsch (Allen Lane, The Penguin Press,
1997). The discovery of public key cryptography by GCHQ is described on
the CESG web pages.
Bibliography
[1] Grover, L. K. "A Fast Quantum Mechanical Algorithm for Database
Search" Proceedings of the 28th Annual ACM Symposium on the Theory of
Computing, Philadelphia, (1996) pp. 212-219.
[2] Brassard, G., Science vol. 275, p. 627 (1997).
[3] Shor, P. "Algorithms for quantum computation: discrete logarithms
and factoring" Proceedings 35th Annual Symposium on Foundations of
Computer Science, Santa Fe, NM, USA, 20-22 Nov. 1994, IEEE Comput. Soc.
Press (1994) pp. 124-134.
[4] Wiesner, S. "Conjugate Coding" SIGACT News, Vol. 15 (1983) pp.
78-88; Bennett, C. H. and Brassard, G., Proceedings of the IEEE
International Conference on Computer Systems and Signal Processing (1984)
p. 175; Ekert, A. "Quantum Cryptography Based on Bell's Theorem" Physical
Review Letters, Vol. 67 (1991) pp. 661-663.
[5] Feynman, R. P. "Simulating Physics with Computers" International
Journal of Theoretical Physics, Vol. 21 (1982) pp. 467-488.
[6] Deutsch, D. "Quantum Theory, the Church-Turing Principle, and the
Universal Quantum Computer" Proc. Roy. Soc. Lond. A400 (1985) pp.
97-117.
[7] Deutsch, D. "Quantum computational networks" Proceedings of the
Royal Society of London, Series A (8 Sept.1989) vol.425, no.1868, pp.
73-90.
[8] Huelga, S.F., Macchiavello, C., Pellizzari, T., Ekert, A.K.,
Plenio, M.B., and Cirac, J.I., Physical Review Letters (1998), vol. 79,
3865.
Updated for web publication by Wim van Dam (June 1999)