15-451 Algorithms 12/07/11
recitation notes
Note: Review session: next Wed 1:00pm Wean 7500.
====================================================================
General Review
--------------
What the course has been about:
1. techniques for developing algorithms
2. techniques for analysis
Break down (1) into
(a) fast subroutines: useful tools that go *inside* an algorithm.
E.g., data structures like B-trees, hashing, union-find.
Sorting, selection, DFS.
(b) problems that are important because you can reduce a lot
of *other* problems to them: network flow, linear programming.
Notion of reduction is important in algorithms and in complexity
theory:
- solve a problem by reducing it to something you know how to do.
- show a problem is hard by reducing a known hard problem to it.
Let's organize the algorithms and problems we've discussed in class
along a "running time" line.
sublinear --- linear --- near-linear --- low-degree poly --- general poly(n) --- hard but probably not NP-complete --- NP-complete
sublinear per operation:
- data structures: hashing, balanced search trees, heaps, union find.
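For instance, union-find with path compression and union by rank gives near-constant amortized time per operation. A minimal sketch (this exact class is illustrative, not code from lecture):

```python
class UnionFind:
    """Union-find with path compression and union by rank:
    near-constant amortized time per operation."""
    def __init__(self, n):
        self.parent = list(range(n))
        self.rank = [0] * n

    def find(self, x):
        # Path compression (halving): point x higher up toward the root.
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]
            x = self.parent[x]
        return x

    def union(self, x, y):
        rx, ry = self.find(x), self.find(y)
        if rx == ry:
            return False
        # Union by rank: attach the shallower tree under the deeper one.
        if self.rank[rx] < self.rank[ry]:
            rx, ry = ry, rx
        self.parent[ry] = rx
        if self.rank[rx] == self.rank[ry]:
            self.rank[rx] += 1
        return True
```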
linear:
- depth-first search, breadth-first search, topological sorting
- selection/median-finding
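As a reminder of selection: randomized quickselect runs in expected linear time (the deterministic median-of-medians version is worst-case linear). A sketch under the convention that k is 0-indexed:

```python
import random

def select(a, k):
    """Return the k-th smallest element (0-indexed) of a,
    in expected linear time via randomized partitioning."""
    a = list(a)
    while True:
        pivot = random.choice(a)
        less = [x for x in a if x < pivot]
        equal = [x for x in a if x == pivot]
        if k < len(less):
            a = less                      # answer is among the smaller elements
        elif k < len(less) + len(equal):
            return pivot                  # pivot is the k-th smallest
        else:
            k -= len(less) + len(equal)   # recurse on the larger elements
            a = [x for x in a if x > pivot]
```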
near-linear:
- greedy algs with good data structures
- Prim, Kruskal, Dijkstra
- divide-and-conquer algorithms
- sorting, FFTs
low-degree & general poly(n):
- dynamic programming: Bellman-Ford, Floyd-Warshall, etc.
- network flow, matchings, min-cost flow
- linear programming
- graph algorithms via matrix multiplication
Hard but probably not NP-complete:
- factoring
NP-complete:
- 3-SAT, Vertex cover, Clique, TSP, etc.
----------------------------------------------------------------
Algorithm tools and where they are typically used:
* Dynamic programming: typically for improving exponential time down to
polynomial.
* Reducing to LP, or reducing to network flow: likewise, typically for
improving exponential time down to polynomial.
* Divide-and-conquer: typically for getting from O(n^2) to O(n log n).
Sometimes just for reducing exponent (like Karatsuba or Strassen).
* Data structures: ditto
* Randomization: everywhere.
* Approximation algorithms: typically for NP-complete problems, but
sometimes useful for easier problems as well.
----------------------------------------------------------------
Analysis tools:
* Amortized analysis, Potential functions, piggy banks:
Typically for showing O(1) or O(log n) amortized cost per operation
* Reductions: proving NP-completeness by reducing a known NP-complete
problem to it, or giving a poly-time algorithm by reducing to a
problem like LP or network flow.
* Recurrences, especially with divide-and-conquer.
* Linearity of expectation
=====================================