
Background: Quantum Complexity Theory.

Ever since Deutsch introduced his quantum generalization of the Turing machine [16], researchers have wondered whether this computational model has computational capabilities greater than those of classical Turing machines. In his original paper, Deutsch showed that quantum computers could exploit ``quantum parallelism'' to compute function values for N inputs simultaneously using only one mechanism. This works because the unitary transforms that apply to quantum states operate on the amplitudes of all the possible classical states of the system simultaneously. Thus, a single unitary transformation that implements the transition function f of a computation can simultaneously take the state $|i\rangle$ to $|f(i)\rangle$ for every i in some index set of the N initial states.
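To make this concrete, here is a minimal numpy sketch of the idea (the function f, the qubit count, and the variable names are ours, chosen for illustration, not Deutsch's): a single unitary matrix U implementing f as a permutation of the basis states moves all N amplitudes in one application.

    import numpy as np

    n = 3                      # number of input bits
    N = 2 ** n                 # number of classical basis states

    def f(i):
        # an arbitrary reversible (bijective) transition function on {0, ..., N-1}
        return (5 * i + 3) % N

    # U implements f on the basis states: U|i> = |f(i)>.
    # Because f is a permutation, U is a permutation matrix and hence unitary.
    U = np.zeros((N, N))
    for i in range(N):
        U[f(i), i] = 1.0

    # A uniform superposition carries amplitude 1/sqrt(N) on every basis state.
    psi = np.full(N, 1.0 / np.sqrt(N))

    # One application of U acts on all N amplitudes at once: the amplitude that
    # sat on |i> now sits on |f(i)>, for every i simultaneously.
    phi = U @ psi
    assert np.isclose(np.linalg.norm(phi), 1.0)   # unitarity preserves the norm

Measuring phi, of course, yields just one of the N basis states, which is exactly the difficulty described next.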

Unfortunately, this parallelism does not by itself let one do N times as much work, because the N results of the computation cannot all be measured. As described in the previous section, communicating state information to the outside world effectively isolates the possible values of the measured state variable from each other, causing states inconsistent with the measured information to have zero amplitude. The only way that information about the amplitudes of different mutually exclusive states can be combined is by taking unitary, linear transformations of those amplitudes before measuring state information.

However, for certain problems, such unitary combinations of the amplitudes of different states may provide information useful for solving the problem. Deutsch hinted at this in his original paper, showing that the XOR of the values of a boolean function on two different inputs could be computed in the time needed to evaluate the function once, if a certain transformation of the result states was performed before measuring them. The catch is that half the time, at random, the measurement yields no information--so the expected rate of finding these XORs is the same as with a classical algorithm that first computed the value for one input, and then the other, and XORed them.
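The behaviour Deutsch described can be reproduced in a few lines of numpy (our own sketch, using the textbook two-qubit reconstruction rather than Deutsch's original notation; the function and helper names are illustrative): half the runs are inconclusive, and the other half return f(0) XOR f(1) from a single evaluation of f.

    import numpy as np

    rng = np.random.default_rng(0)
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

    def deutsch_xor_trial(f):
        # U_f on two qubits: |x, y> -> |x, y XOR f(x)>  (a permutation, hence unitary)
        U_f = np.zeros((4, 4))
        for x in (0, 1):
            for y in (0, 1):
                U_f[2 * x + (y ^ f(x)), 2 * x + y] = 1.0

        psi = np.zeros(4)
        psi[0] = 1.0                               # start in |0,0>
        psi = np.kron(H, np.eye(2)) @ psi          # input qubit -> (|0> + |1>)/sqrt(2)
        psi = U_f @ psi                            # one evaluation of f
        psi = np.kron(H, H) @ psi                  # transform before measuring

        outcome = rng.choice(4, p=np.abs(psi) ** 2)
        a, b = outcome >> 1, outcome & 1
        return a if b == 1 else None               # b = 0: this run tells us nothing

    f = lambda x: x                                # any boolean function of one bit
    trials = [deutsch_xor_trial(f) for _ in range(20)]
    # About half the entries are None; every other entry equals f(0) ^ f(1).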

In a later paper [19], Deutsch did much better--together with Jozsa he showed that a certain property of functions could be determined with certainty exponentially faster by quantum programs than by classical ones, if the function is given as a black box as input to the program. The property (for a function that returns 0 or 1) is whether the function is variable (it has value 0 for some inputs, and 1 for some inputs), or biased (it has one value for more inputs than the other value) if we are given that it is not both. If, on the contrary, it is both--say if its value is 1 on two-thirds of the inputs--the quantum algorithm may return either answer.
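Under that promise the question reduces to the familiar constant-versus-balanced form, and the reason the quantum algorithm decides it with certainty can be seen in a few lines (again our own numpy sketch, using the phase form of the oracle query): after Hadamards on every qubit, one query, and Hadamards again, the probability of measuring the all-zeros string is 1 for a constant function and exactly 0 for a balanced one.

    import numpy as np

    n = 4
    N = 2 ** n

    def prob_all_zeros(f_values):
        # Phase form of one oracle call: the amplitude on |x> picks up (-1)**f(x).
        amps = np.full(N, 1.0 / np.sqrt(N)) * (-1.0) ** f_values
        # The final layer of Hadamards sends the state back toward |0...0>;
        # the |0...0> amplitude is the normalized sum of all N amplitudes.
        return abs(amps.sum() / np.sqrt(N)) ** 2

    constant = np.zeros(N, dtype=int)                               # f = 0 everywhere
    balanced = np.array([bin(x).count("1") % 2 for x in range(N)])  # parity of x

    print(prob_all_zeros(constant))   # 1.0  -> f is not variable (here: constant)
    print(prob_all_zeros(balanced))   # 0.0  -> f is not biased (here: balanced)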

Unfortunately, it is hard to think of a realistic scenario where such an ability might be useful. For example, if the given function is a simple boolean formula applied to its input bits, we may be interested in knowing whether the function is variable (SAT problem), but if it is, who cares whether it is biased! Unfortunately, if the function is highly biased, as is generally the case for hard SAT problems, then the algorithm will almost always answer ``biased'' instead of ``variable,'' giving us no help with the satisfiability question. A classical algorithm could do just as well on SAT by trying input assignments at random. The Deutsch-Jozsa algorithm is really only helpful if we know that the input function is either constant or unbiased, and we cannot tolerate any non-zero probability of failure in determining which one it is. This seems like an unnatural problem.

But in any case, following the Deutsch-Jozsa paper, analysis of the power of quantum computers developed rapidly with papers [8, 7, 6] that defined various quantum complexity classes and compared them with classical complexity classes in relativized oracle settings similar to Deutsch and Jozsa's. Quantum operations were also found to have uses in implementing various cryptographic operations; see the end of [8] for a summary. Quantum analogues of the popular classical complexity classes such as BPP (bounded-error probabilistic polynomial time) and ZPP (zero-error probabilistic polynomial time) were defined, and several of the quantum classes were shown to be larger than their classical counterparts, but only in relativized oracle settings.

However, none of the oracle problems addressed seemed particularly evocative of real problems until Simon's paper [39], which showed that the following problem is in ZQP (zero-error quantum polynomial time): We are given a function f and told that either f is 1-to-1, or else it is 2-to-1 and there is some bit-mask s such that $f(x) = f(x \oplus s)$ (where $\oplus$ is bitwise exclusive-or) for all input bit-patterns x. The problem is to determine which of the two is true and, in the latter case, to find s. This seems a better problem than the earlier ones because it actually returns a significant amount of information about its input, in the form of the XOR-mask s (if it exists). Simon showed that quantum computers can solve this problem with certainty using a polynomial number of queries of the input function, whereas classical algorithms require exponentially many queries to achieve a reasonable probability of success.
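The structure of Simon's algorithm is easy to sketch classically (our own illustration; we sample the known output distribution of the quantum subroutine rather than simulating the full 2n-qubit circuit, and the mask and helper names are ours): each run of the subroutine returns a uniformly random y with $y \cdot s \equiv 0 \pmod 2$, and roughly n such equations pin down s.

    import numpy as np

    rng = np.random.default_rng(1)
    n = 6
    s = 0b101101                       # the hidden mask (unknown to the solver)

    def simon_subroutine():
        # Stand-in for one run of the quantum circuit (Hadamards, one oracle
        # call, Hadamards, measure): the measured y is uniform over the set
        # { y : y . s = 0 (mod 2) }, so we sample that set directly.
        while True:
            y = int(rng.integers(0, 2 ** n))
            if bin(y & s).count("1") % 2 == 0:
                return y

    def gf2_rank(vectors):
        # Rank over GF(2) of n-bit integers, by elimination on top set bits.
        basis = []                         # kept with distinct, descending top bits
        for v in vectors:
            for b in basis:
                v = min(v, v ^ b)          # clear the pivot bit of b from v
            if v:
                basis.append(v)
                basis.sort(reverse=True)
        return len(basis)

    # Collect equations y . s = 0 until only the promised one-dimensional
    # solution space remains; n - 1 independent equations suffice.
    rows = []
    while gf2_rank(rows) < n - 1:
        rows.append(simon_subroutine())
    print(len(rows), "quantum queries were enough (expected O(n))")

Recovering s itself from the collected equations is then ordinary Gaussian elimination over GF(2).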

The extraordinary thing about Simon's construction was its use of a particular unitary transformation, equivalent to a special case of the discrete Fourier transform, that had been introduced earlier by Bernstein and Vazirani [6]. Originally this Fourier transform was used to solve a certain simple oracle problem using $O(1)$ queries on a quantum computer as opposed to the $\Omega(n)$ queries that are classically required. The Fourier transform is linear and invertible; it turns out that it is unitary as well, and a discrete Fourier transform on functions of n-bit inputs can be performed on a quantum computer in time polynomial in n using a recursive procedure related to the classical fast-Fourier-transform (FFT) algorithm [28].
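To illustrate the FFT connection, here is a short numpy sketch (ours, not taken from [28]) that builds the unitary discrete Fourier transform on $2^n$ points by the same even/odd (radix-2) recursion that underlies the classical FFT; the quantum circuit constructions decompose this same structure into a polynomial number of elementary gates. The special case over $Z_2^n$ used by Bernstein-Vazirani and Simon is simply a Hadamard gate applied to each qubit.

    import numpy as np

    def qft_matrix(n):
        # Unitary DFT on N = 2**n points, built by the radix-2 (even/odd)
        # recursion that also underlies the classical FFT.
        N = 2 ** n
        if n == 0:
            return np.eye(1, dtype=complex)
        F_half = qft_matrix(n - 1)
        Z = np.zeros((N // 2, N // 2))
        # Permutation sending even-indexed inputs to the top half, odd to the bottom.
        P = np.zeros((N, N))
        for j in range(N // 2):
            P[j, 2 * j] = 1.0
            P[j + N // 2, 2 * j + 1] = 1.0
        # "Butterfly" stage: combine the two half-size transforms with twiddle phases.
        D = np.diag(np.exp(2j * np.pi * np.arange(N // 2) / N))
        I = np.eye(N // 2)
        butterfly = np.block([[I, D], [I, -D]]) / np.sqrt(2)
        return butterfly @ np.block([[F_half, Z], [Z, F_half]]) @ P

    n = 3
    N = 2 ** n
    F = qft_matrix(n)
    direct = np.exp(2j * np.pi * np.outer(np.arange(N), np.arange(N)) / N) / np.sqrt(N)
    assert np.allclose(F, direct)                       # same matrix as the direct DFT
    assert np.allclose(F @ F.conj().T, np.eye(N))       # and it is unitary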

Simon's ingenious use of the quantum Fourier-transform algorithm to reduce an exponentially hard problem to polynomial time was the original inspiration for Shor's application of a more general version of the transform to a difficult and practical problem: factoring large integers. We now summarize Shor's algorithm.

