The search algorithm starts by evenly dividing amplitude among the goods at a low level $K$ of the lattice. A convenient choice for binary CSPs is to start at a low level $K$, for which the number of sets grows only polynomially with $N$. Then for each level from $K$ to $L$, we adjust the phases of the states depending on whether they are good or nogood and map to the next level. Thus if $\psi^{(j)}_s$ represents the amplitude of the set $s$ at level $j$, we have
$$\psi^{(j+1)}_r = \sum_k u_k \sum_{|s \cap r| = k} \rho_s \, \psi^{(j)}_s$$
where $\rho_s$ is the phase assigned to the set $s$ after testing whether it is nogood, $u_k$ is the matrix element coupling sets with $k$ items in common, and the final inner sum is over all sets $s$ that have $k$ items in common with $r$. That is, $\rho_s = 1$ when $s$ is a good set. For nogoods, $\rho_s = -1$ when using the phase inversion method, and $\rho_s = e^{i\theta}$ with $\theta$ uniformly selected from $[0, 2\pi)$ when using the random phase method. Finally we measure the state, obtaining a complete set. This set will be a solution with probability
$$P = \sum_{\text{solutions } s} \left|\psi^{(L)}_s\right|^2$$
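One phase-and-map step can be sketched classically as follows. This is a minimal illustration, not the paper's construction: the overlap coefficients `u[k]` are hypothetical placeholders standing in for the actual matrix elements, which must be chosen so that the full map is unitary.

```python
import itertools

def map_level(psi, is_nogood, u, n, next_size):
    """One step of the lattice mapping: invert the phase of nogood sets
    (the phase inversion method), then combine amplitudes into sets one
    level up, weighted by a coefficient depending only on the overlap.

    psi:       dict from frozensets at the current level to amplitudes
    is_nogood: predicate marking sets that violate some constraint
    u:         u[k] weights pairs of sets sharing k items (placeholder)
    n:         number of items in the problem
    """
    # Phase step: nogood sets have their amplitude's sign inverted.
    phased = {r: (-a if is_nogood(r) else a) for r, a in psi.items()}
    # Mapping step: each next-level set s collects amplitude from every
    # current-level set r, weighted by u[|r & s|].
    out = {}
    for s in itertools.combinations(range(n), next_size):
        s = frozenset(s)
        out[s] = sum(u[len(r & s)] * a for r, a in phased.items())
    return out
```

With all amplitude on the singleton `{0}` and no nogoods, only the size-2 sets containing item 0 receive amplitude, matching the overlap-weighted sum above.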
with the sum over the solution sets; its value depends on the particular problem and on the method used to select the phases.
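The measurement step is easy to mirror classically: the probability of obtaining a solution is the sum of squared amplitude magnitudes over the solution sets. In this sketch, `psi` is assumed to hold the final normalized amplitudes and `is_solution` is a hypothetical solution test.

```python
def solution_probability(psi, is_solution):
    """Probability that measuring the final state yields a solution:
    sum of |amplitude|^2 over solution sets (psi assumed normalized)."""
    return sum(abs(a) ** 2 for s, a in psi.items() if is_solution(s))

# Two-state example: amplitudes 0.6 and 0.8 (norm 1), one solution.
p = solution_probability({"a": 0.6, "b": 0.8}, lambda s: s == "a")
# p is 0.36
```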
What computational resources are required for this algorithm? The storage requirements are quite modest: $N$ bits can produce a superposition of $2^N$ states, enough to represent all the possible sets in the lattice structure. Since each trial of this algorithm gives a solution only with some probability $P$, on average it will need to be repeated $1/P$ times to find a solution. The cost of each trial consists of the time required to construct the initial superposition and then evaluate the mapping on each step from the level $K$ to the solution level $L$, a total of $L - K$ mappings. Because the initial state consists of sets of size $K$, there are only a polynomial number of them (i.e., $\binom{N}{K}$), and hence the cost to construct the initial superposition is relatively modest. The mapping from one level to the next will need to be produced by a series of more elementary operations that can be directly implemented in physical devices. Determining the required number of such operations remains an open question, though the particularly simple structure of the matrices should not require involved computations and should also be able to exploit special-purpose hardware. At any rate, this mapping is independent of the structure of the problem, so its cost does not affect the relative costs of different problem structures. Finally, determining the phases to use for the nogood sets involves testing the sets against the constraints, a relatively rapid operation for NP search problems. Thus, to examine how the cost of this search algorithm depends on problem structure, the key quantity is the behavior of $P$.
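The claim that a trial succeeding with probability $P$ must be repeated $1/P$ times on average can be checked empirically: the number of independent trials to the first success is geometrically distributed with mean $1/P$. The value `p = 0.25` below is an arbitrary choice for illustration.

```python
import random

def trials_to_success(p, rng):
    """Repeat an experiment that independently succeeds with probability p
    per trial; return the number of trials until the first success."""
    count = 1
    while rng.random() >= p:
        count += 1
    return count

rng = random.Random(1)
p = 0.25
mean = sum(trials_to_success(p, rng) for _ in range(20000)) / 20000
# mean is close to 1/p = 4
```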