Memoization may be viewed as a mechanism for re-using a computation---if a
function is re-applied to the same argument we may re-use the previous
computation to determine the result, rather than perform it again.
Conventional memoization is \emph{accurate} in the sense that it only permits
re-use when a computation will be precisely the same as one that has already
been performed. But in many cases a computation may be largely, though not
entirely, the same as one that has been previously carried out for a slightly
different input. The previous computation may be re-used by permitting
\emph{inaccurate} memoization, and restoring accuracy by \emph{adapting} the
result to the variant input. This technique, which we call \emph{adaptive
memoization}, greatly increases the effectiveness, for incremental
computation, of memoization and adaptivity used alone or in orthogonal combination.
In particular we obtain an incremental version of Quicksort that adjusts its
output in logarithmic expected time to insertions and deletions at random
positions in the input, and an incremental version of Insertion Sort that
adjusts its output in linear time to an insertion or deletion anywhere in its
input. These results are the best possible for these algorithms, and rely
crucially on adaptive memoization.
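To make the idea concrete, the following is a minimal, illustrative Python sketch of inaccurate memoization with adaptation, applied to sorting. It is an assumption-laden toy, not the paper's actual mechanism (which builds on adaptivity and change propagation): the class name and cache-scanning strategy are hypothetical, and the linear scan over the cache is for clarity only.

```python
import bisect

class AdaptiveSortMemo:
    """Illustrative sketch (not the paper's mechanism): reuse a cached
    sorted result when the new input differs from a previously seen
    input by a single inserted element, adapting the old output."""

    def __init__(self):
        self.cache = {}  # maps input tuple -> sorted output list

    def sort(self, xs):
        key = tuple(xs)
        if key in self.cache:
            # Accurate memoization: exact match, reuse as-is.
            return self.cache[key]
        # Inaccurate memoization: look for a cached input that is this
        # input minus one element, then adapt its result by insertion.
        for old_key, old_out in self.cache.items():
            remaining = list(xs)
            for v in old_key:
                try:
                    remaining.remove(v)
                except ValueError:
                    break  # old input is not contained in the new one
            else:
                if len(remaining) == 1:
                    out = list(old_out)
                    # Adapt: one O(n) insertion instead of a full re-sort.
                    bisect.insort(out, remaining[0])
                    self.cache[key] = out
                    return out
        # No reusable computation: perform the work from scratch.
        out = sorted(xs)
        self.cache[key] = out
        return out
```

A cache hit on a one-element extension of a previous input thus costs a single insertion into an already-sorted list, mirroring (in a naive way) how adapting a prior computation to a slightly different input can be far cheaper than recomputing.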