15-451 Algorithms 11/14/12
recitation notes
* Go over quiz
* online algorithms
* more on complexity classes
=================================================================
-> Go over quiz
-> If desired, can go over the online notes on online algorithms
-> If desired, here is some more info on complexity classes.
A little more complexity theory
===============================
Turns out there are a number of interesting complexity classes that
you can define in terms of a polynomial-time algorithm V that takes in
two inputs: an instance I, and an auxiliary string X, where we assume
that the length of X is polynomial in the size of the instance.
NP: A is in NP if there is a polynomial-time algorithm V(I,X) such that:
If "I" is in YES_A, then there exists X such that V(I,X) = YES.
If "I" is in NO_A, then for all X, V(I,X) = NO.
Co-NP: other way around from NP:
If "I" is in YES_A, then for all X, V(I,X) = YES.
If "I" is in NO_A, then there exists X such that V(I,X) = NO.
BPP: (two-sided error randomized polynomial-time)
If "I" is in YES_A, then for at least 2/3 of X, V(I,X) = YES.
If "I" is in NO_A, then for at least 2/3 of X, V(I,X) = NO.
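To make the NP-style definition concrete, here is a toy verifier sketch (the formula encoding is my own, not from the notes): for SAT, the instance I is a CNF formula and the auxiliary string X is a truth assignment. If I is satisfiable, some X makes V say YES; if not, every X makes V say NO.

```python
# Toy NP-style verifier V(I,X) for CNF-SAT (illustrative sketch).
# An instance I is a list of clauses, each a list of signed variable
# indices (e.g., -2 means "x2 is false"). X is a truth assignment,
# one bool per variable.

def V(I, X):
    """Return True (YES) iff assignment X satisfies every clause of I."""
    def lit_true(lit):
        value = X[abs(lit) - 1]
        return value if lit > 0 else not value
    return all(any(lit_true(lit) for lit in clause) for clause in I)

# (x1 or x2) and (not x1 or x2) is satisfiable, so SOME X works...
I = [[1, 2], [-1, 2]]
print(V(I, [True, True]))   # a good certificate makes V say YES
print(V(I, [True, False]))  # ...but a bad X just makes V say NO
```

Note that V itself runs in polynomial time; all the hardness is hidden in finding (or ruling out) a good X.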
SOME MORE INTERESTING COMPLEXITY CLASSES
========================================
P^{NP} = what you can do in polynomial-time given access to an oracle
for solving circuit-SAT. (You just pay 1 unit for each oracle call).
P^{#P} = what you can do in polynomial-time given access to an oracle
for the problem "Given a circuit C, HOW MANY inputs y have C(y)=1?"
PSPACE = what you can do in polynomial work-space (but possibly
exponential time).
PSPACE contains P^{#P} because we can implement the oracle by just
running the circuit on each of the 2^n different inputs one at a time,
and incrementing a counter every time we get an output of 1. This
takes a long time, but not a lot of work-space.
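The brute-force oracle simulation above can be sketched directly (the circuit representation here is my own: just a Python function on bit-tuples). We walk through all 2^n inputs one at a time, keeping only the current input and a counter: exponential time, tiny work-space.

```python
from itertools import product

# Sketch of simulating the #P oracle in small work-space: run the
# circuit C on each of the 2^n inputs in turn, incrementing a counter
# whenever the output is 1.

def count_satisfying(C, n):
    """Count inputs y in {0,1}^n with C(y) = 1."""
    count = 0
    for y in product([0, 1], repeat=n):
        if C(y):
            count += 1
    return count

# Example circuit: majority of three bits.
maj = lambda y: (y[0] + y[1] + y[2]) >= 2
print(count_satisfying(maj, 3))  # 4 of the 8 inputs have majority 1
```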
Canonical PSPACE-complete problem: Does the first player have a
winning strategy in game X (where X is a sufficiently interesting
game, so long as you ensure the length of the game is polynomial).
E.g., "generalized geography": Given a directed graph G and a
starting node v_0. Players take turns picking edges: each edge must
start where the previous edge left off (the first edge must go out
from v_0), and no edge can be picked twice. E.g., think of the nodes
as letters and an edge from i to j as the name of a country that
starts with letter i and ends with letter j. The first player who
can't move loses.
Why is this even in PSPACE? You want to know: "does there exist a
move for me such that for all moves for my opponent, there exists a
move for me, etc., so that I win?" In polynomial space you can do the
game-tree search, propagating back the values.
In particular, to calculate if state s is a win for me, we just
consider each possible move for me, and recursively calculate whether
the resulting state is a win or tie for my opponent. If there exists
a move for me such that the recursive answer is "no", then s *is* a
winning state for me. Amount of space used is just proportional to
the *depth* of the recursion (the length of the game) since we just
need to remember the path we came from. This is true even if there
may be exponentially many game states.
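The recursive calculation above can be sketched for generalized geography (the graph encoding and function name here are my own). Note that we deliberately do not memoize: there may be exponentially many states, and the point is that the space used is only the current path of moves.

```python
# Sketch of the game-tree search described above. A state is the
# current node plus the set of edges already used; G maps each node
# to its list of out-neighbors.

def first_player_wins(G, v, used=frozenset()):
    """True iff the player to move from node v has a winning strategy."""
    for w in G.get(v, []):
        if (v, w) not in used:
            # If my opponent does NOT win from the resulting state,
            # then this move wins for me.
            if not first_player_wins(G, w, used | {(v, w)}):
                return True
    return False  # no legal move, or every move hands the opponent a win

G = {'a': ['b', 'c'], 'b': ['c'], 'c': []}
print(first_player_wins(G, 'a'))  # moving a->c strands the opponent
```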
It's not even known if P = PSPACE!
L = Log-SPACE = what can be done using only O(log n) memory [besides a
read-only input]
NL = nondeterministic log-space. Log-space with a birdie.
E.g., deciding whether there is a path from A to B in a directed graph G
is in NL. You just store two registers: one is a step counter, and one
holds the current node, which you update with each guessed next step.
You follow the guesses and see if you reach B before the counter gets
to n. If there exists a path, then there exists a series of guesses
that will work. If not, then there doesn't.
This problem in fact turns out to be complete for NL.
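The two-register check can be sketched as follows (the graph encoding and function name are my own): the "birdie" supplies a sequence of guessed next nodes, and the verifier keeps only the current node and a counter, i.e. O(log n) bits.

```python
# Sketch of the guess-checking procedure described above: accept iff
# following the guessed nodes from A reaches B within n steps.

def check_guesses(G, A, B, guesses):
    n = len(G)           # register 1: a counter bounded by n
    current = A          # register 2: the current node
    for steps, nxt in enumerate(guesses, start=1):
        if steps > n or nxt not in G.get(current, []):
            return False  # illegal guess or out of time
        current = nxt
        if current == B:
            return True
    return current == B

G = {0: [1], 1: [2], 2: []}
print(check_guesses(G, 0, 2, [1, 2]))  # a correct series of guesses works
print(check_guesses(G, 0, 2, [2]))     # a wrong guess is rejected
```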
It turns out there is also a deterministic algorithm that uses O((log n)^2)
space (but not polynomial time). This is not obvious -- it's Savitch's
theorem, which more generally says that anything you can do in
non-deterministic space S can be done deterministically in space O(S^2).
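The idea behind Savitch's theorem can be sketched for reachability (the function name is my own): ask "can u reach v in at most 2^k steps?" by trying every node as a midpoint and recursing on both halves. The recursion depth is O(log n) and each frame stores O(log n) bits, giving O((log n)^2) space, at the cost of recomputing subproblems over and over, hence the superpolynomial time.

```python
import math

# Sketch of Savitch's midpoint recursion for directed reachability.

def reach(G, u, v, k):
    """True iff there is a path from u to v of length at most 2^k in G."""
    if u == v:
        return True
    if k == 0:
        return v in G.get(u, [])  # one step: check the edge directly
    # Guess the midpoint m of a path of length at most 2^k by trying
    # every node; each half then has length at most 2^(k-1).
    return any(reach(G, u, m, k - 1) and reach(G, m, v, k - 1)
               for m in G)

G = {0: [1], 1: [2], 2: [3], 3: []}
k = math.ceil(math.log2(len(G)))   # any path has at most n-1 <= 2^k edges
print(reach(G, 0, 3, k))
```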