Algorithms, July 2020 at CIS
- Instructor: David Woodruff
- Lectures: Sat 7-10am Beijing time
- TAs: Handan Luo and Reyna Wu
Design and analyze algorithms! This offering places particular focus on first- and second-order optimization methods.
Grading is based on three homeworks (10% each), an exam (20%), a final project (40%), and class participation (10%).
We encourage homework solutions, scribe notes, and final projects to be typeset in LaTeX. If you are not familiar with LaTeX, see this introduction.
- Properties of convex sets. See these slides at CMU for similar material
- More convex geometry and convex functions. See these slides at CMU for similar material
- Gradients, Hessians, and first- and second-order characterizations of convexity (mostly whiteboard).
- Gradient descent and some motivation for second-order methods (mostly whiteboard).
- Convergence of gradient descent, projected gradient descent, online gradient descent, strong convexity and the faster convergence it yields (mostly whiteboard).
- Coordinate descent, Newton's method, stochastic gradient descent, neural networks, submodular function maximization/minimization (mostly whiteboard).
- Stochastic gradient descent, batches, backpropagation, convolutional neural networks.
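The contrast between the first- and second-order methods listed above can be sketched in a few lines. The following is a minimal illustration, not course-provided code: the toy quadratic objective, step size, and iteration counts are all assumptions chosen for the example. It shows plain gradient descent taking many small steps, while Newton's method, which uses the Hessian, reaches the minimizer of a quadratic in a single step; this is part of the usual motivation for second-order methods.

```python
import numpy as np

# Toy objective (an assumption for illustration):
#   f(x, y) = (x - 1)^2 + 10 * (y + 2)^2, minimized at (1, -2).
def grad(v):
    x, y = v
    return np.array([2.0 * (x - 1.0), 20.0 * (y + 2.0)])

def hess(v):
    # The Hessian is constant because f is quadratic.
    return np.array([[2.0, 0.0], [0.0, 20.0]])

def gradient_descent(x0, step=0.05, iters=500):
    # First-order update: x_{t+1} = x_t - step * grad f(x_t).
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = x - step * grad(x)
    return x

def newton(x0, iters=1):
    # Second-order update: x_{t+1} = x_t - H(x_t)^{-1} grad f(x_t).
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = x - np.linalg.solve(hess(x), grad(x))
    return x

x_gd = gradient_descent([0.0, 0.0])
x_nt = newton([0.0, 0.0])  # one Newton step suffices on a quadratic
```

For a smooth convex function with gradient Lipschitz constant L, a step size below 2/L guarantees that gradient descent makes progress; here the step 0.05 is well within that range for this objective.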
Materials from the following course might be useful for various parts of this class:
Maintained by David Woodruff