15-451 Algorithms 02/11/09
RECITATION NOTES
* B-trees and treaps
=======================================================================
B-trees
=======
Have the class work out the result of inserting A,B,C,D,E,F,G,H,I into
an initially empty B-tree with t=2 (i.e., a 2-3-4 tree). For instance,
after the first 3 letters you just have one node. Then, the next
insert brings the B up, and the D goes into the leaf with the C. E goes
into the leaf, and then F brings up the D. At the very end, the "I"
should cause the root to get split.
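The hand trace above can be checked with a short script. Below is a minimal B-tree insertion sketch in the CLRS top-down style (class and function names are our own choices, not from the course); with t=2 each node holds 1-3 keys, and full nodes are split on the way down:

```python
# Minimal B-tree insertion with minimum degree t=2 (a 2-3-4 tree).
# Our own illustrative sketch; names are not from the course materials.
T = 2  # each non-root node holds between T-1 and 2T-1 keys

class Node:
    def __init__(self, leaf=True):
        self.keys = []
        self.children = []
        self.leaf = leaf

def split_child(parent, i):
    """Split parent's full child i, moving its median key up into parent."""
    full = parent.children[i]
    median = full.keys[T - 1]
    right = Node(leaf=full.leaf)
    right.keys, full.keys = full.keys[T:], full.keys[:T - 1]
    if not full.leaf:
        right.children, full.children = full.children[T:], full.children[:T]
    parent.keys.insert(i, median)
    parent.children.insert(i + 1, right)

def insert(root, key):
    """Insert key, splitting full nodes top-down; returns the (new) root."""
    if len(root.keys) == 2 * T - 1:        # full root: split it, growing height
        new_root = Node(leaf=False)
        new_root.children.append(root)
        split_child(new_root, 0)
        root = new_root
    node = root
    while not node.leaf:
        i = sum(k < key for k in node.keys)
        if len(node.children[i].keys) == 2 * T - 1:
            split_child(node, i)
            if key > node.keys[i]:
                i += 1
        node = node.children[i]
    node.keys.insert(sum(k < key for k in node.keys), key)
    return root

root = Node()
for ch in "ABCDEFGHI":
    root = insert(root, ch)
# inserting "I" finds the root [B, D, F] full and splits it: D moves up
```

Printing the levels gives root [D], children [B] and [F], and leaves [A], [C], [E], [G,H,I], matching the walkthrough above.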
Treaps and tree rotations
=========================
* Remind students of the definition: keys are in BST order, and
  priority values are in heap order (if v is a child of u, then
  priority(v) > priority(u), so the minimum priority is at the root).
* Algorithm: insert new (key,prio) pair as if doing usual BST
insertion, and then do tree rotations (if needed) to bring up the new
node until the heap condition is satisfied.
* Do an example. You can have students make up (key, priority) pairs
and then insert them in and do the rotations. Remind students of how
tree rotations work.
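One way to run such an example is with a short sketch (our own, with min-heap order on priorities per the definition above; names are illustrative, not from the notes):

```python
class TreapNode:
    def __init__(self, key, prio):
        self.key, self.prio = key, prio
        self.left = self.right = None

def rotate_right(y):
    """Rotate y's left child x up; preserves the in-order (BST) sequence."""
    x = y.left
    y.left, x.right = x.right, y
    return x

def rotate_left(x):
    """Rotate x's right child y up; mirror of rotate_right."""
    y = x.right
    x.right, y.left = y.left, x
    return y

def insert(node, key, prio):
    """Usual BST insert, then rotate the new node up while it violates
    the heap order (a child's priority must exceed its parent's)."""
    if node is None:
        return TreapNode(key, prio)
    if key < node.key:
        node.left = insert(node.left, key, prio)
        if node.left.prio < node.prio:
            node = rotate_right(node)
    else:
        node.right = insert(node.right, key, prio)
        if node.right.prio < node.prio:
            node = rotate_left(node)
    return node

# example pairs: (50, 3), (30, 5), (70, 2); inserting key 70 violates the
# heap order at the root (2 < 3), so a left rotation brings 70 up to the root
root = None
for key, prio in [(50, 3), (30, 5), (70, 2)]:
    root = insert(root, key, prio)
```

After the rotation, 70 is the root with 50 as its left child and 30 below that; both the BST order and the heap order hold.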
* Intuition to keep in mind is that the tree will look *as if* we had
  inserted in order of increasing priority. So, if we set priorities
  at random, the treap looks just like the recursion tree of
  randomized quicksort.
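This intuition can be checked empirically. The sketch below (our own; helper names are hypothetical) builds one tree by treap insertion in a shuffled key order and another by plain BST insertion in increasing-priority order, then compares their shapes:

```python
import random

class N:
    def __init__(self, key, prio):
        self.key, self.prio = key, prio
        self.left = self.right = None

def rotate_right(y):
    x = y.left
    y.left, x.right = x.right, y
    return x

def rotate_left(x):
    y = x.right
    x.right, y.left = y.left, x
    return y

def treap_insert(t, key, prio):
    """BST insert followed by rotations restoring min-heap order."""
    if t is None:
        return N(key, prio)
    if key < t.key:
        t.left = treap_insert(t.left, key, prio)
        if t.left.prio < t.prio:
            t = rotate_right(t)
    else:
        t.right = treap_insert(t.right, key, prio)
        if t.right.prio < t.prio:
            t = rotate_left(t)
    return t

def bst_insert(t, key, prio):
    """Plain BST insert, no rotations."""
    if t is None:
        return N(key, prio)
    if key < t.key:
        t.left = bst_insert(t.left, key, prio)
    else:
        t.right = bst_insert(t.right, key, prio)
    return t

def shape(t):
    """Nested-tuple encoding of the tree, for structural comparison."""
    return None if t is None else (t.key, shape(t.left), shape(t.right))

random.seed(42)
pairs = list(zip(range(30), random.sample(range(1000), 30)))
random.shuffle(pairs)
treap = bst = None
for k, p in pairs:                                   # arbitrary order
    treap = treap_insert(treap, k, p)
for k, p in sorted(pairs, key=lambda kp: kp[1]):     # increasing priority
    bst = bst_insert(bst, k, p)
# the two trees have identical shape
```

With distinct keys and priorities, the treap is uniquely determined by the (key, priority) pairs, so the insertion order does not matter.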
* See if class can put together argument for why the rotation process
is guaranteed to restore the heap property. The induction
hypothesis is that all ancestor-descendant relations are satisfied
(priority(ancestor) < priority(descendant)) *except* possibly for
the case where the descendant is the newly-inserted node. Prove that
this is maintained after each rotation. Why does this then imply
what we want? [because each rotation brings the new node closer to
the root, and once the rotations stop -- the new node's priority
exceeds its parent's, or it has reached the root -- the exception is
no longer needed, so the heap property holds everywhere]
* The lecture notes contain an argument showing that if nodes are
given random priority values, then the treap will have depth O(log n)
with high probability. We didn't get to this in class though. If you
have time, it would be great if you can do it in recitation. It's a
bit tricky --- see the lecture notes for details. Note that
this is a stronger statement than what we proved in class about
randomized quicksort (that the expected total number of comparisons is
O(n log n), which just implies that the expected *average* depth of
the nodes is O(log n)).
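If there is no time for the proof, an empirical check makes the claim concrete. The sketch below (our own; not from the notes) builds a treap on n = 1024 keys with random priorities, inserting the keys in sorted order to stress the worst case for a plain BST, and measures the depth, which should come out within a small constant factor of log2(n) = 10:

```python
import random

class N:
    def __init__(self, key, prio):
        self.key, self.prio = key, prio
        self.left = self.right = None

def rotate_right(y):
    x = y.left
    y.left, x.right = x.right, y
    return x

def rotate_left(x):
    y = x.right
    x.right, y.left = y.left, x
    return y

def insert(t, key, prio):
    """BST insert plus rotations restoring min-heap order on priorities."""
    if t is None:
        return N(key, prio)
    if key < t.key:
        t.left = insert(t.left, key, prio)
        if t.left.prio < t.prio:
            t = rotate_right(t)
    else:
        t.right = insert(t.right, key, prio)
        if t.right.prio < t.prio:
            t = rotate_left(t)
    return t

def depth(t):
    """Number of nodes on the longest root-to-leaf path."""
    return 0 if t is None else 1 + max(depth(t.left), depth(t.right))

random.seed(0)
n = 1024
t = None
# keys arrive in sorted order (worst case for a plain BST), but the
# random priorities keep the treap shallow anyway
for key, prio in zip(range(n), random.sample(range(10**6), n)):
    t = insert(t, key, prio)
d = depth(t)
# d is far below n: typically only a small multiple of log2(n) = 10
```

Without the rotations, sorted-order insertion would give a path of depth 1024; the random priorities bring it down to O(log n).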