RECITATION NOTES 15-451 Algorithms 09/18/13
- hand back hwk
- go over mini #2
- go over practice quiz
- other problems
=======================================================================
1. hand back hwk. If you want, you can go over questions people have.
2. go over mini #2 (these were both old quiz questions)
3. go over practice quiz if people want, except for Problem 3, which
is on material we haven't covered yet [if you want to make more hard
copies, it is in the private directory. I prefer not to have
electronic copies floating around though].
4. Talk about universal and perfect hashing. Give the analysis of
perfect hashing using O(N) space from lecture notes 7.5.2. [Even if I
finish it in lecture, it will probably be too fast, so students will
appreciate seeing it again. Plus it's related to hwk 2]
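[If it helps to have something concrete on the board, here is a
minimal sketch of the standard universal hash family
h(x) = ((a*x + b) mod p) mod m. The particular prime p and table size
m below are just illustrative choices, not anything from the lecture
notes.]

```python
import random

# Sketch of a universal hash family: pick a prime p larger than any key,
# then choose a != 0 and b uniformly at random. The family guarantees
# that for any fixed distinct keys x != y, Pr[h(x) == h(y)] <= 1/m
# over the random choice of (a, b).

P = 10_000_019  # an (illustrative) prime larger than the key universe

def make_universal_hash(m):
    """Pick a random member of the family mapping keys in [0, P) to [0, m)."""
    a = random.randrange(1, P)
    b = random.randrange(0, P)
    return lambda x: ((a * x + b) % P) % m

h = make_universal_hash(100)
```

The point to stress: the randomness is in the choice of hash function,
not in the keys, so the 1/m collision bound holds for *every* pair of
distinct keys an adversary might pick.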
5. As an alternative to (4), if people are rusty on probabilistic
analysis, you could do a basic intro to probability and a discussion
of linearity of expectation, like this:
A probabilistic setting is defined by a sample space (e.g., if we
throw two dice, then the sample space consists of all 36 outcomes that
the pair of dice can have) and a probability distribution over this
sample space (if these are fair dice, then each outcome has
probability 1/36). Note: we are only talking about discrete
probabilities, so no measure theory or integrals are needed. An
*event* is a subset of the sample space. E.g., "the event that the
dice sum to 3". A *random
variable* is a function over the elements in the sample space. E.g.,
"the sum of the two dice". Events either happen or don't; random
variables are *functions* over what happens. The points in the sample
space are often called "elementary events" (because they are elements
of the sample space and they are events).
The expected value of a random variable X is just its
probability-weighted average:
E[X] = \sum_e Pr(e)*X(e)
where the sum is over all elementary events e.
You can also rearrange to write this as:
E[X] = \sum_a Pr(X = a)*a
where we have grouped all elementary events e in which X(e)=a together
into a single event "X=a".
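[If a concrete check helps, here is a short sketch that computes both
forms of E[X] for the two-dice example above, using exact fractions so
the two sums match exactly. The variable names are just illustrative.]

```python
from fractions import Fraction
from collections import defaultdict

# Sample space: all 36 outcomes of two fair dice, each with probability 1/36.
sample_space = [(d1, d2) for d1 in range(1, 7) for d2 in range(1, 7)]
pr = Fraction(1, 36)

X = lambda e: e[0] + e[1]  # the random variable "sum of the two dice"

# First form: E[X] = sum over elementary events e of Pr(e) * X(e).
ev1 = sum(pr * X(e) for e in sample_space)

# Second form: group the elementary events with X(e) = a into the single
# event "X = a", then compute E[X] = sum over a of Pr(X = a) * a.
grouped = defaultdict(Fraction)
for e in sample_space:
    grouped[X(e)] += pr
ev2 = sum(p * a for a, p in grouped.items())

# Both forms give 7, the expected sum of two fair dice.
```

Regrouping the terms of the first sum by the value of X(e) gives the
second sum, which is why the two definitions agree.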
Linearity of expectation says that for any two random variables X and
Y, E[X+Y] = E[X] + E[Y]. This is an extremely useful fact. It's also
easy to prove from our definitions, like this:
Let Z = X+Y. Remember that Z is a *function* over elementary events,
defined as Z(e) = X(e)+Y(e). So we have:
E[Z] = \sum_e Pr(e)*Z(e)
= \sum_e Pr(e)*(X(e) + Y(e))
= \sum_e Pr(e)*X(e) + \sum_e Pr(e)*Y(e)
= E[X] + E[Y].
Notice that all we really did here was apply the distributive law.
But this simple-to-prove statement is surprisingly powerful.
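[The distributive-law proof can be checked mechanically on the
two-dice sample space; a small sketch, with illustrative names:]

```python
from fractions import Fraction

# Sample space: all 36 outcomes of two fair dice, each with probability 1/36.
sample_space = [(d1, d2) for d1 in range(1, 7) for d2 in range(1, 7)]
pr = Fraction(1, 36)

X = lambda e: e[0]          # value of the first die
Y = lambda e: e[1]          # value of the second die
Z = lambda e: X(e) + Y(e)   # Z = X + Y is itself a function over elementary events

# E[·] computed directly from the definition: sum of Pr(e) * value(e).
E = lambda rv: sum(pr * rv(e) for e in sample_space)

# Linearity: E[Z] = E[X] + E[Y], exactly as the distributive-law argument shows.
assert E(Z) == E(X) + E(Y)
```

Worth pointing out to students: nothing in the proof used independence
of X and Y, so linearity holds for arbitrary (even highly correlated)
random variables.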