Complexities

We now give an upper bound on the number of consistency checks performed by HAC in the worst case. Function $Revise(x_i,v_j)$ can be called at most $kd$ times for each dual variable $v_j$: once for every deletion of a value from the domain of $x_i$, where $x_i$ is one of the $k$ original variables constrained with $v_j$. In each call to $Revise(x_i,v_j)$ the algorithm performs at most $d$ checks (one for each value $a\in D(x_i)$) to see if $currentSupport_{x_i,a,v_j}$ is valid (line 17). If $currentSupport_{x_i,a,v_j}$ is not valid, HAC tries to find a new supporting tuple for $a$ in $D(v_j)$. To check if a tuple $\tau$ that contains the assignment $(x_i,a)$ supports $a$, we need to check if $\tau$ is valid. If a tuple is not valid, then one of its values has been removed from the domain of the corresponding variable, which means that the tuple has also been removed from the domain of the dual variable. Therefore, checking the validity of a tuple can be done in constant time by looking in the domain of the dual variable. The algorithm needs to check for support at most the $d^{k-1}$ tuples that contain the assignment $(x_i,a)$. Since HAC stores $currentSupport_{x_i,a,v_j}$, at each call to $Revise(x_i,v_j)$ and for each value $a\in D(x_i)$ it only checks tuples that have not been checked before. In other words, each of the $d^{k-1}$ tuples is checked at most once for each value of $x_i$. So overall, in the worst case, we have $d^{k-1}$ checks plus the $d$ checks to test the validity of the current support. For the $kd$ values, the upper bound on checks performed by HAC to make one dual variable AC is $O(kd(d+d^{k-1}))=O(kd^{k})$. For $e$ dual variables the worst-case complexity bound is $O(ekd^{k})$, which is the same as the complexity of GAC in the non-binary representation.
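
The following Python sketch illustrates this counting argument. It is an illustration only: the set-based domain representation and the helper $tuples\_with$ are our assumptions, not part of the original pseudocode.

\begin{verbatim}
def revise(x_i, v_j, domains, current_support, tuples_with):
    """Sketch of Revise(x_i, v_j): prune values of the original
    variable x_i that have no valid supporting tuple in D(v_j).

    domains[v_j] is a set holding the tuples still in D(v_j), so
    testing the validity of a tuple is a constant-time membership
    query.  tuples_with[(x_i, a)] is a hypothetical iterator over
    the at most d^(k-1) tuples containing the assignment (x_i, a);
    being an iterator, it is consumed across calls, so no tuple is
    checked twice.  Returns True iff a value was deleted."""
    pruned = False
    for a in list(domains[x_i]):
        tau = current_support.get((x_i, a, v_j))
        if tau is not None and tau in domains[v_j]:
            continue                      # stored support still valid
        for tau in tuples_with[(x_i, a)]:
            if tau in domains[v_j]:       # constant-time validity check
                current_support[(x_i, a, v_j)] = tau
                break                     # new support found
        else:
            domains[x_i].discard(a)       # no support left: prune a
            pruned = True
    return pruned
\end{verbatim}

Because the validity test is a set-membership query on the dual variable's domain, each check costs $O(1)$, and the consumed iterator guarantees that each of the $d^{k-1}$ candidate tuples is examined at most once per value, exactly as in the analysis above.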

The asymptotic space complexity of the HAC algorithm is dominated by the $O(ed^k)$ space needed to store the domains of the dual variables. The algorithm also requires $O(nde)$ space to store the current supports. Since the space required grows exponentially with the arity of the constraints, it is reasonable to expect that the HVE (and the other binary encodings) will not be practical for constraints of large arity, unless the constraints are very tight.
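
For a rough sense of this growth, the snippet below prints the worst-case number of stored tuples for a few arities; the values of $d$ and $e$ are hypothetical, chosen only for illustration.

\begin{verbatim}
d, e = 10, 20                  # assumed domain size and constraint count
for k in (3, 5, 8, 10):        # increasing constraint arity
    print(f"k = {k:2d}: up to {e * d**k:,} stored tuples")  # O(e d^k)
\end{verbatim}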

As mentioned, a consistency check in the non-binary representation is done differently than in the HVE. Assume that GAC-2001 looks for a support for value $a_i\in D(x_i)$ in constraint $c$, where $vars(c)=\{x_1,\ldots, x_k\}$ and $x_i\in vars(c)$. A tuple $\tau=(a_1,\ldots ,a_k)$ supports $a_i$ if $\tau[x_i]=a_i$ and $\tau$ is valid. To check if $\tau$ is valid, GAC-2001 has to check whether the values $a_1,\ldots ,a_k$ (except $a_i$) are still in the domains of the variables $x_1,\ldots ,x_k$. Therefore, in the worst case, a consistency check by GAC-2001 involves $k-1$ operations. In contrast, HAC checks the validity of a tuple in constant time by looking in the domain of the corresponding dual variable to see if the tuple is still there. However, this means that the algorithm has to update the (usually) large domains of the dual variables after each value deletion from an original variable. This affects the run times of the two algorithms in different problem settings.
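
A hedged sketch of the two validity tests makes this trade-off concrete; the function and parameter names below are illustrative assumptions, not identifiers from either algorithm's published pseudocode.

\begin{verbatim}
def valid_nonbinary(tau, domains, scope, x_i):
    """GAC-2001 style test: up to k-1 membership checks, one per
    constrained variable other than x_i."""
    return all(tau[pos] in domains[x]
               for pos, x in enumerate(scope) if x != x_i)

def valid_hve(tau, dual_domain):
    """HAC style test: one constant-time lookup in the dual
    variable's domain, a set of the tuples still valid."""
    return tau in dual_domain

def prune_dual(dual_domain, pos, a):
    """The price HAC pays: after value a is deleted from the
    variable at position pos, every tuple assigning a to it must
    be removed from the (usually large) dual domain."""
    dual_domain.difference_update(
        {tau for tau in dual_domain if tau[pos] == a})
\end{verbatim}

Which approach is cheaper overall depends on how often tuples are tested for validity versus how often the dual domains must be updated, which is precisely the trade-off reflected in the run times.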

In Appendix A we show that HAC not only has the same complexity as GAC-2001, but also performs exactly the same number of consistency checks on arc consistent problems. We also show that on arc inconsistent problems there can be a difference in the number of checks in favor of the HVE.
