Summaries of earlier lectures

Num | Date | Summary |
---|---|---|

24 | 14.Nov | Today we considered some techniques from Differential Geometry for analyzing smooth curves in 3D Euclidean space. We defined the Frenet frame field (the unit tangent, principal normal, and binormal vectors along a curve). |
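The Frenet frame can be computed numerically as well as symbolically. Below is a minimal sketch, assuming a helix as the example curve (the curve and step size are illustrative choices, not taken from the lecture): the frame is built from finite-difference approximations of the velocity and acceleration.

```python
import numpy as np

def curve(t):
    # Example curve (assumed): the helix c(t) = (cos t, sin t, t).
    return np.array([np.cos(t), np.sin(t), t])

def frenet_frame(t, h=1e-4):
    # Velocity and acceleration via central differences.
    v = (curve(t + h) - curve(t - h)) / (2 * h)
    a = (curve(t + h) - 2 * curve(t) + curve(t - h)) / h**2
    T = v / np.linalg.norm(v)        # unit tangent
    B = np.cross(v, a)
    B = B / np.linalg.norm(B)        # unit binormal
    N = np.cross(B, T)               # unit principal normal
    return T, N, B

T, N, B = frenet_frame(1.0)
# The frame is orthonormal (up to finite-difference error):
print(np.dot(T, N), np.dot(T, B), np.dot(N, B))
```

The pairwise dot products printed at the end are zero up to numerical error, confirming the frame is orthonormal.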

25 | 19.Nov | We began our discussion of Markov chains today. We defined the semantics of the stochastic matrix. We classified states as transient or recurrent (persistent), and as periodic or aperiodic. We stated that all states in a recurrent class have the same type. We observed that a recurrent class may always be thought of as a Markov chain in its own right. We observed that in a finite chain not all states can be transient and that no persistent state can be a null state. A Markov chain that is irreducible and aperiodic has a unique stationary distribution, given by a left eigenvector of its stochastic matrix with eigenvalue 1. For a general Markov chain, there may be several dimensions of such left eigenvectors, one for each recurrent class. The left eigenvector associated with a recurrent class describes the stationary distribution for that recurrent class. Finally, we considered some examples of periodic chains with one or more recurrent classes. We observed that the eigenvalues of the stochastic matrix contain roots of unity, reflecting the periodicity of its recurrent classes. |
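The two eigenvalue facts above can be checked numerically. The following sketch (the example chains are assumed, not taken from the lecture) recovers a stationary distribution as a left eigenvector with eigenvalue 1, and shows that a period-3 chain has the cube roots of unity as eigenvalues.

```python
import numpy as np

# Irreducible, aperiodic two-state chain: unique stationary distribution.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])
vals, vecs = np.linalg.eig(P.T)     # right eigenvectors of P^T = left of P
i = np.argmin(np.abs(vals - 1))     # pick the eigenvalue closest to 1
pi = np.real(vecs[:, i])
pi = pi / pi.sum()                  # normalize to a probability distribution
print(pi)                           # satisfies pi @ P == pi

# Periodic chain: a deterministic 3-cycle. Its eigenvalues are the
# three cube roots of unity, reflecting the period-3 structure.
C = np.array([[0, 1, 0],
              [0, 0, 1],
              [1, 0, 0]])
print(np.linalg.eigvals(C))
```

For this `P`, the stationary distribution works out to (5/6, 1/6); the cycle matrix `C` has eigenvalues of modulus 1 spaced at angles of 120 degrees.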

26 | 21.Nov | We applied our discussion of Markov chains with a look at some gambling problems. We then discussed card shuffling and how perfect riffle shuffles create periodic subchains, without actually randomizing the cards. We observed that the stochastic matrix for any shuffling algorithm is doubly stochastic, so long as the shuffling is based simply on physical rearrangements of the cards. Consequently, the uniform distribution is a stationary distribution. If riffle shuffles have stochastic errors in them, then shuffling converges to the uniform distribution in roughly 8 shuffles for a 52-card deck. On the other hand, 8 perfect riffle shuffles simply reproduce the cards in their initial order. |
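The perfect-riffle claim is easy to verify directly. Here is a small sketch (an assumed illustration, not code from the lecture) of an "out" riffle shuffle, which splits the deck in half and interleaves the halves with the top card staying on top; applying it 8 times to a 52-card deck restores the original order.

```python
def perfect_riffle(deck):
    # Split into halves and interleave, top card staying on top
    # (a perfect "out" shuffle).
    half = len(deck) // 2
    top, bottom = deck[:half], deck[half:]
    out = []
    for a, b in zip(top, bottom):
        out += [a, b]
    return out

deck = list(range(52))
d = deck
for _ in range(8):
    d = perfect_riffle(d)
print(d == deck)  # True: 8 perfect out-shuffles restore the order
```

This periodicity follows because the out-shuffle sends the card at position p to position 2p mod 51, and 2^8 = 256 is congruent to 1 mod 51.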

27 | 26.Nov | We had two guest lectures today, given by two of our teaching assistants. Jaynanth Mogali spoke about projections of convex polyhedra onto subspaces. The lecture covered Fourier Elimination in detail, mentioned Farkas' Lemma, and described applications in combinatorial optimization. Sha Yi spoke about matchings in graphs. The lecture covered basic definitions and the Hopcroft-Karp algorithm in detail, then mentioned the Hungarian Algorithm, Edmonds' Blossom Algorithm, and approaches using Integer Linear Programming. |
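To make the matching material concrete, here is a sketch of bipartite maximum matching using the simple augmenting-path method (the O(VE) precursor of Hopcroft-Karp, which speeds this up by finding a maximal set of shortest augmenting paths per phase). The example graph is an assumption for illustration, not one from the lecture.

```python
def max_matching(adj, n_right):
    # adj[u] lists the right-side vertices adjacent to left vertex u.
    match_right = [None] * n_right   # match_right[v] = left partner of v

    def try_augment(u, seen):
        # Search for an augmenting path starting at free left vertex u.
        for v in adj[u]:
            if v in seen:
                continue
            seen.add(v)
            # Take v if it is free, or if its partner can re-match elsewhere.
            if match_right[v] is None or try_augment(match_right[v], seen):
                match_right[v] = u
                return True
        return False

    return sum(try_augment(u, set()) for u in range(len(adj)))

# Left vertices 0-2, right vertices 0-2 (hypothetical example graph).
adj = [[0, 1], [0], [1, 2]]
print(max_matching(adj, 3))  # prints 3: a perfect matching exists
```

Vertex 1 on the left can only take right vertex 0, so the augmenting search re-routes earlier choices; that re-routing is exactly what an augmenting path does.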

Back to the webpage for 16-811