Recall the definition of Big-O notation:
T(n) = O(f(n)) if there exist constants c > 0 and n_0 such that T(n) <= c f(n) for all n > n_0.

For example, we can say that 2n² + 2n + 1 is O(n²). We can also say that it is O(n³), if we'd like, though this isn't as informative.
Computer scientists have developed another, similar notation over the years to deal with lower bounds of functions, called big-Omega notation:
T(n) = Omega(f(n)) if there exist constants c > 0 and n_0 such that T(n) >= c f(n) for all n > n_0.

As an example of using this notation, 2n² + 2n + 1 is Omega(n²). It's also Omega(n), but, again, that isn't as informative.
Finally, we have a way of giving a tight bound on a function, called Theta notation:
T(n) = Theta(f(n)) if there exist constants c_1, c_2 > 0 and n_0 such that c_1 f(n) <= T(n) <= c_2 f(n) for all n > n_0.

This way we can say that 2n² + 2n + 1 is Theta(n²), which implies that it is both O(n²) and Omega(n²).
One way to think about this is in terms of limits (though this is a simplification). Consider the limit of the ratio of T(n) to f(n):

                          / inf   : T(n) = Omega(f(n))
     lim  T(n) / f(n) == {  c > 0 : T(n) = Theta(f(n)) = Omega(f(n)) = O(f(n))
    n->inf                \ 0     : T(n) = O(f(n))

(If the limit does not exist, it's still possible that T(n) = O(f(n)),
or T(n) = Omega(f(n)), or even T(n) = Theta(f(n)); e.g.,
T(n) = n(2 + sin n) is Theta(n), since n <= T(n) <= 3n, even though
the ratio 2 + sin n has no limit.)
Often we want to analyze worst-case time complexity of an algorithm. We can define the worst-case time T(n) to be the longest time it takes on any input of size <= n.
As an example, let's look at a code fragment for searching for an integer X in a linked list of n ints:
    for (ptr = list; ptr != NULL; ptr = ptr->next) {
        if (ptr->value == X) return 1;
    }
    return 0;
The worst case time of this code is O(n), even though it may frequently
do much better. (It's also Theta(n).)
Now let's look at another piece of code:
    for (i = 0; i < n; ++i) {
        for (j = 0; j < n; ++j) {
            // constant time stuff...
        }
    }
This code runs in O(n²) time.
In preparing the assignment, we had to write a program to compute the maze from a dictionary of n 5-letter words. Suppose that we did it by comparing all word pairs to see if they differ in exactly one character. Then the code would look like this:
    for (i = 0; i < n; i++) {
        for (j = i + 1; j < n; j++) {
            if (word[i] differs from word[j] in one character) {
                add (i, j) to the list of edges;
                add (j, i) to the list of edges;
            }
        }
    }
The first time through the outer loop, we go through n - 1 words; the
second time we go through n - 2 words; and so on until the nth time
through the loop we go through 0 words. So the amount of time spent is:
T(n) = (n - 1) + (n - 2) + (n - 3) + ... + 3 + 2 + 1 = n(n - 1) / 2

So the time for this algorithm is O(n²).
We didn't actually do this, since this O(n²) algorithm would have taken quite a while to handle the 4000 words in the bigger maze. Try to think of an algorithm that beats it. (Our method was actually about O(n).)