15-850: Advanced Algorithms, Fall 2018
- Lecturer: Anupam Gupta, GHC 7203.
- TAs: Jason Li, Goran Zuzic
- Office hours: Anupam: Tue 4-5 p.m. in GHC 7203, Jason: Thu 10–11 a.m. in GHC 7607, Goran: Wed 5–6 p.m. in GHC 7515
- Location: GHC 4211, MWF 1:30-2:50 (note: 3 days a week)
- Blog: https://cmualgos.github.io/
- Piazza: here
- Gradescope: Use course code 92K6XG to enroll.
Announcements
- All announcements will be on Piazza, so please make sure you're signed up there!
- A preliminary schedule for the lectures is posted. Please sign up to scribe!
Homeworks
Lectures
- Week #1: Spanning Trees
- Lecture 1. (Aug 27) MSTs. Recap of Prim/Kruskal/Boruvka. The Fredman-Tarjan \(O(m \log^\star n)\)-time algorithm. The Karger-Klein-Tarjan randomized algorithm. (my notes: det, rand; prelim scribe notes)
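As a warm-up for the recap (my own sketch, not part of the lecture materials; names are mine), here is a minimal Kruskal's algorithm with a union-find structure:

```python
class DSU:
    """Union-find with path halving and union by size."""
    def __init__(self, n):
        self.parent = list(range(n))
        self.size = [1] * n

    def find(self, x):
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]  # path halving
            x = self.parent[x]
        return x

    def union(self, a, b):
        ra, rb = self.find(a), self.find(b)
        if ra == rb:
            return False
        if self.size[ra] < self.size[rb]:
            ra, rb = rb, ra
        self.parent[rb] = ra
        self.size[ra] += self.size[rb]
        return True

def kruskal(n, edges):
    """edges: list of (weight, u, v). Returns (MST weight, list of MST edges)."""
    dsu = DSU(n)
    total, tree = 0, []
    for w, u, v in sorted(edges):       # scan edges in increasing weight order
        if dsu.union(u, v):             # keep the edge iff it joins two components
            total += w
            tree.append((u, v))
    return total, tree
```

The lectures then improve on this classical \(O(m \log n)\)-type bound.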
- Lecture 2. Directed MSTs (aka Arborescences and Branchings). LP approaches. (my notes, my LP notes, Corwin's scribe notes, slides, blog post)
- Lecture 3. Dynamic Graph Connectivity algorithms. Frederickson, Kapron et al., Holm et al. (my notes, Justin's and Anish's unedited scribe notes, blog post)
Worst-case update times:
- Greg Frederickson, Data Structures for On-Line Updating of Minimum Spanning Trees, with Applications, SICOMP 1985. (and slides by Ran Duan)
- Kapron, King, and Mountjoy, Dynamic graph connectivity in polylogarithmic worst case time, SODA 2013. (and slides)
See also: a SODA 2012 paper of Ahn, Guha, and McGregor with a similar sketching idea.
- Recent progress on worst-case dynamic MST by Nanongkai, Saranurak, and Wulff-Nilsen, FOCS 2017.
And for amortized update times which we did not go over:
- Week #2: Shortest-Path Trees
- Lecture 4: Shortest Paths. Seidel's algorithm for (unweighted undirected) APSP using matrix multiplication. (my notes, Alireza's scribe notes, slides, blog post)
- Notes from 210, 451, and Jeff Erickson (SSSP, APSP), covering Dijkstra, Bellman-Ford(-Moore), Floyd-Warshall, and Johnson's algorithm.
- R. Seidel, On the all-pairs-shortest-path problem, STOC 1992.
- Alon, Galil, Margalit, Naor, Witnesses for Boolean Matrix Multiplication and for Shortest Paths, FOCS 1992.
- M. Fredman, New Bounds on the Complexity of the Shortest Path Problem, SICOMP 1976.
And some newer results which we won't have time to cover:
- Ryan Williams' algorithm in time \(n^3/\exp(\sqrt{\log n})\), 2014. (my notes, scribe notes)
- What if APSP cannot be solved in \(n^{3-\varepsilon}\) time: a survey by Virginia Vassilevska Williams for the International Congress of Mathematicians, 2018.
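To make the matrix-multiplication reduction concrete, here is a sketch of Seidel's recursion in Python/NumPy, assuming a connected, unweighted, undirected graph (the function name is mine). The graph is repeatedly "squared" until complete; Seidel's parity trick (comparing \(TA\) against vertex degrees) then decides whether each distance is \(2T_{ij}\) or \(2T_{ij}-1\):

```python
import numpy as np

def seidel_apsp(A):
    """All-pairs distances of a connected, unweighted, undirected graph.

    A: (n, n) 0/1 integer adjacency matrix with zero diagonal."""
    n = A.shape[0]
    Z = A @ A
    # B: adjacency matrix of the squared graph (edge iff distance <= 2)
    B = ((A == 1) | (Z > 0)).astype(np.int64)
    np.fill_diagonal(B, 0)
    off_diag = ~np.eye(n, dtype=bool)
    if np.all(B[off_diag] == 1):       # squared graph is complete: base case
        return 2 * B - A               # distance 1 on edges, 2 otherwise
    T = seidel_apsp(B)                 # distances in the squared graph
    X = T @ A
    deg = A.sum(axis=1)
    # d(i,j) = 2*T[i,j] - 1 exactly when X[i,j] < T[i,j] * deg[j]
    return 2 * T - (X < T * deg).astype(np.int64)
```

Each level does O(1) matrix multiplications and the diameter halves, giving \(O(M(n)\log n)\) time, where \(M(n)\) is the matrix-multiplication cost.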
- Lecture 5: Low-stretch Trees. Bartal's construction. [Jason guest-lecturing!] (Anupam's notes, Roie's scribe notes, blog post)
- Lecture notes from our randomized algorithms course (S11).
- Alon, Karp, Peleg, West, A Graph-Theoretic Game and its Application to the k-Server Problem, SICOMP 1995.
- Bartal, Probabilistic Approximation of Metric Spaces and its Algorithmic Applications, FOCS 1996. Our presentation pretty much followed his general construction.
- Fakcharoenphol, Rao, Talwar, A tight bound on approximating arbitrary metrics by tree metrics, STOC 2003. This paper gives the best possible algorithm for the metric graph case.
- Abraham, Neiman, Using Petal-Decompositions to Build a Low Stretch Spanning Tree. This paper gives the current best algorithm for the general graph case.
- Week #3: Matchings
- Lecture 6: Matchings in Graphs: combinatorial algorithms (my notes, (unedited) Vy's scribe notes)
- Lecture 7: Matchings in Graphs: Linear Programs and Integrality of Polytopes (my LP notes, my matching polytope notes, (unedited) Greg's scribe notes, blog post)
- Notes from our LP/SDP course on the integrality of the non-bipartite perfect matching polytope.
- Edmonds, Maximum matching and a polyhedron with 0,1-vertices. This gives a combinatorial algorithm to find a max-weight matching in general graphs, which we did not see. Of course, you can just solve the LP to find a vertex solution, and it will be integral.
- Lecture 8: Matchings in Graphs: Algebraic Algorithms (my notes, old scribe notes, blog post)
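As a small taste of the algebraic approach, here is a sketch of the Tutte-Lovász randomized test (names and the choice of prime are mine): the determinant of the Tutte matrix is a nonzero polynomial iff the graph has a perfect matching, so we evaluate it at random values modulo a large prime.

```python
import random

def det_mod(M, p):
    """Determinant of an integer matrix modulo prime p, via Gaussian elimination."""
    n = len(M)
    M = [row[:] for row in M]
    det = 1
    for c in range(n):
        piv = next((r for r in range(c, n) if M[r][c]), None)
        if piv is None:
            return 0
        if piv != c:
            M[c], M[piv] = M[piv], M[c]
            det = -det
        det = det * M[c][c] % p
        inv = pow(M[c][c], p - 2, p)       # modular inverse via Fermat's little theorem
        for r in range(c + 1, n):
            f = M[r][c] * inv % p
            M[r] = [(M[r][k] - f * M[c][k]) % p for k in range(n)]
    return det % p

def has_perfect_matching(n, edges, trials=3, seed=0):
    """Tutte-Lovász test: build the Tutte matrix with random entries and
    test its determinant mod p. One-sided error: a 'no' answer is wrong
    with probability at most n/p per trial."""
    p = (1 << 61) - 1                      # a Mersenne prime
    rng = random.Random(seed)
    for _ in range(trials):
        T = [[0] * n for _ in range(n)]
        for i, j in edges:
            v = rng.randrange(1, p)
            T[i][j], T[j][i] = v, (-v) % p  # skew-symmetric: T[j][i] = -T[i][j]
        if det_mod(T, p):
            return True
    return False
```

Rank computations over the same matrix extend this to maximum matchings (Lovász, Mucha-Sankowski).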
- Week #4: Concentration of Measure and Dimension Reduction Techniques
- Lecture 9 (9/17): Concentration of Measure: "Chernoff-Hoeffding" Bounds. (my notes, (unedited) Sarthak's scribe notes)
- A gentle introduction to the probabilistic method by Jiri Matousek and Jan Vondrak.
- Notes on Chernoff bounds from our randomized algorithms course (S11, S04).
- Need some advanced concentration inequalities? Look at these notes by Colin McDiarmid, or Gabor Lugosi.
- Or these fine books: Alon & Spencer; Boucheron, Lugosi & Massart; Dubhashi & Panconesi; Motwani & Raghavan; Mitzenmacher & Upfal; Janson, Luczak & Rucinski; and Vershynin.
- Notes from Pascal Massart's St. Flour lectures, and some notes by David Pollard.
- David Wajc's notes and Robin Pemantle's talk on negative association and related topics.
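As a quick numerical sanity check (my own demo, not from the notes), the bound \(\Pr[X \ge (1+\delta)\mu] \le e^{-\delta^2\mu/3}\) for \(0 < \delta \le 1\) can be compared against the exact binomial tail:

```python
import math

def binomial_upper_tail(n, p, t):
    """Exact Pr[Bin(n, p) >= t]."""
    return sum(math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(t, n + 1))

n, p, delta = 100, 0.5, 0.2
mu = n * p                                  # mean: 50
t = round((1 + delta) * mu)                 # threshold: 60
exact = binomial_upper_tail(n, p, t)
bound = math.exp(-delta**2 * mu / 3)        # Chernoff upper-tail bound
```

Here the Chernoff bound is loose (about 0.51 against a true tail of about 0.028), which is typical: its value is the exponential decay as \(\mu\) and \(\delta\) grow, not tightness at any fixed point.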
- Lecture 10: (9/19) Dimension Reduction: Johnson-Lindenstrauss and related topics (my notes, old scribe notes)
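A minimal Johnson-Lindenstrauss demo (my own sketch; the parameters are illustrative, not tuned to any \(\varepsilon\)): project points through a random Gaussian matrix with entries of variance \(1/k\) and check that pairwise distances are roughly preserved.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, k = 10, 1000, 256                        # n points, ambient dim d, target dim k
X = rng.standard_normal((n, d))
G = rng.standard_normal((d, k)) / np.sqrt(k)   # entries ~ N(0, 1/k)
Y = X @ G                                      # projected points in R^k

# ratio of projected to original distance, for every pair of points
ratios = [np.linalg.norm(Y[i] - Y[j]) / np.linalg.norm(X[i] - X[j])
          for i in range(n) for j in range(i + 1, n)]
```

Each squared ratio is a scaled \(\chi^2_k\) variable, so it concentrates around 1 with standard deviation roughly \(1/\sqrt{2k}\); the JL lemma is a union bound over the \(\binom{n}{2}\) pairs.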
- Lecture 11 (9/21): Dimension Reduction: Singular Value Decompositions (my notes, old scribe notes)
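A short illustration of the Eckart-Young theorem via NumPy's SVD (my own sketch): truncating the SVD to the top k singular values gives the best rank-k approximation in both Frobenius and spectral norm.

```python
import numpy as np

def best_rank_k(M, k):
    """Best rank-k approximation of M (Eckart-Young: truncate the SVD)."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    # keep the top k singular triples; U[:, :k] * s[:k] scales column j by s[j]
    return (U[:, :k] * s[:k]) @ Vt[:k, :]
```

In particular, a rank-1 matrix is recovered exactly by its rank-1 truncation.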
- Week #5: Fixed-Parameter Algorithms and More Large-Deviation Bounds
- Lecture 12: (9/24) FPT Algorithms: Color-Coding, Branching, Kernelization. [Jason guest-lecturing!] (Jason's notes, (unedited) David, George, and Mounira's scribe notes)
- Well-written, thorough introduction to FPT algorithms: Cygan et al. textbook. We covered Sections 2.2.1, 3.1, 5.2.1.
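A compact sketch of color-coding for the k-path problem (my own code, following the idea named above): under a random k-coloring, a fixed path on k vertices becomes "colorful" (all colors distinct) with probability \(k!/k^k \ge e^{-k}\), and colorful paths can be found by a DP over color subsets, so repeating over fresh colorings boosts the success probability.

```python
import random

def colorful_path(adj, k, color):
    """DP: is there a path on k vertices using k distinct colors?
    dp[v] holds the color sets (bitmasks) of colorful paths ending at v."""
    n = len(adj)
    dp = [{1 << color[v]} for v in range(n)]
    for _ in range(k - 1):                      # extend paths one vertex at a time
        ndp = [set() for _ in range(n)]
        for v in range(n):
            for used in dp[v]:
                for u in adj[v]:
                    if not (used >> color[u]) & 1:   # u's color must be unused
                        ndp[u].add(used | (1 << color[u]))
        dp = ndp
    return any(dp[v] for v in range(n))

def has_k_path(adj, k, trials=300, seed=1):
    """Color-coding: try independent random k-colorings; one-sided error."""
    rng = random.Random(seed)
    n = len(adj)
    for _ in range(trials):
        color = [rng.randrange(k) for _ in range(n)]
        if colorful_path(adj, k, color):
            return True
    return False
```

The DP runs in \(O(2^k \cdot m)\)-type time per coloring, giving an overall \(2^{O(k)} \cdot \mathrm{poly}(n)\) algorithm.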
- Lecture 13: (9/26) Algorithmic Graph Theory: Treewidth and Planarity [Jason guest-lecturing!] (Jason's notes, (unedited) Benjamin and Yao Chong's scribe notes)
- Well-written introduction to treewidth: Cygan et al. textbook, Chapter 7
- Computing treewidth: a simple 4-approximation to treewidth: Chapter 7.6 of the textbook
- Baker's technique: Section 11.3 of Jeff Erickson's notes
- Lecture 14: (9/28) Applications of Concentration Inequalities [Goran guest-lecturing!] (Goran's notes, blog post, Ziye and Tai's scribe notes)
- Week #6: Online Learning and Using it to Solve LPs
- Week #7: Convex Optimization: First-Order Methods and Connections to Online
Learning
- Week #8: Linear and Convex Optimization: Polynomial-time Algorithms
- Week #9: Approximation and Online Algorithms
- Week #10: Listeners' Choice Week
- Lecture 26 (10/29): Approximation Algorithms using Semidefinite Programming (Alex's and Pedro's (unedited) notes, my notes)
- Lecture 27 (10/31): Semidefinite Programming: Duality and SOS (my notes, blog post)
- Please see the references for lecture 26; more to come.
- Lecture 28 (11/02): Decision-making under uncertainty: the secretary and the prophet inequality problems (blog post)
- A short introduction to secretary problems and prophet inequalities; from the IPCO 2017 summer school.
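A minimal prophet-inequality demo (my own setup with uniform values): accepting the first value at least \(\tau = \mathbb{E}[\max]/2\) guarantees expected reward at least \(\mathbb{E}[\max]/2\), i.e., half of what a prophet who sees all values would get.

```python
import random

def threshold_pick(values, tau):
    """Prophet-inequality rule: accept the first value >= tau (reward 0 if none)."""
    for v in values:
        if v >= tau:
            return v
    return 0.0

rng = random.Random(0)
n, trials = 3, 20000
e_max = n / (n + 1)          # E[max of n iid Uniform(0,1)] = n/(n+1)
tau = e_max / 2              # the half-of-the-prophet threshold
payoff = sum(threshold_pick([rng.random() for _ in range(n)], tau)
             for _ in range(trials)) / trials
```

In this instance the simulated payoff comfortably beats the \(\mathbb{E}[\max]/2\) guarantee, which is worst-case over distributions, not tight for uniforms.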
Description
An intensive graduate course on the design and analysis of
algorithms, picking up from where 15-451 stopped. It will be more
specialized and in-depth than the Graduate Algorithms course 15-750.
We will cover advanced algorithmic ideas (some classical, some very recent), and the theory behind them (theorems, proofs). For topics that were covered in 451/750, the goal is to cover advanced content and new techniques.
Syllabus
The current list of potential topics is:
- MSTs. Recap Prim/Kruskal/Boruvka, an \(O(m \log^\star n)\) time
algorithm, a randomized linear-time algorithm, MSTs in directed graphs
- Shortest paths. recall Dijkstra/Bellman-Ford/etc, Seidel's
algorithm using matrix multiplication, Williams' subcubic APSP
algorithm for general graphs.
- Matchings: classical matchings using augmenting paths,
matching using matrix inversions (Tutte/Lovasz, Mucha-Sankowski),
matchings using random walks.
- The Experts algorithm via multiplicative weights, online gradient
descent.
- Flows: max-flows via electrical flows, Sherman's near-linear time max-flow.
- Linear and Convex Programming
- Solving convex optimization problems via gradient descent,
ellipsoid, interior-point methods
- Using convex optimization to solve combinatorial
problems (maybe)
- High-dimensional geometry: Dimension reduction and singular value
decompositions.
- Fixed-parameter tractable algorithms.
- Streaming algorithms (a.k.a. algorithms for big data)
- Online algorithms
- Approximation Algorithms
- The Sums-of-Squares Paradigm
- Submodular Optimization
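The experts item above can be sketched as follows (the Hedge variant of multiplicative weights; parameter names are mine): maintain a weight per expert, play proportionally to the weights, and decay each weight exponentially in its loss.

```python
import math

def hedge(losses, eta=0.5):
    """Hedge / multiplicative weights: play p_i proportional to w_i, then
    update w_i *= exp(-eta * loss_i).

    losses: list of rounds, each a list of per-expert losses in [0, 1].
    Returns the algorithm's total expected loss."""
    n = len(losses[0])
    w = [1.0] * n
    total = 0.0
    for round_losses in losses:
        W = sum(w)
        total += sum(w[i] / W * round_losses[i] for i in range(n))
        w = [w[i] * math.exp(-eta * round_losses[i]) for i in range(n)]
    return total
```

With \(\eta\) tuned to the horizon, the total loss exceeds the best single expert's loss by only \(O(\sqrt{T \ln n})\).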
Here are the Spring 2017 and Spring 2015 versions of this course. The topics this year will be similar
(though not identical). I will take a poll in mid-Oct about topics to
cover in the last 4-5 lectures.
Effort and Criteria for Evaluation:
We have 3 lectures a week for the first ~10 weeks of the
semester. Attendance is required. You will scribe 1-2 lectures in LaTeX:
either you will polish/augment old lecture notes, or will write new
notes. We have ~7 HWs (including HW0), half solved with collaboration,
others solved individually. Your grade will depend on these HWs, scribe
notes, class participation, and attendance.
Textbook:
There is no textbook for the course. Most of our material is not in
textbooks (yet), so we will rely on papers, monographs, and lecture
notes.
Prerequisites:
You should know material in standard UG courses in algorithms,
probability, and linear algebra very well. Above all, mathematical
maturity/curiosity and problem-solving skills are a must. That way you
can make up for gaps in your knowledge yourself. It is natural not to know everything, or to be rusty on some concepts, but you should learn how to learn. This may seem vague, so here are some concrete pointers.
- You should know material from an undergraduate Algorithms
course, say until Lecture 16 (P/NP) from
our Spring
2018 version of 15-451. You should definitely know DP, flows,
basics of linear programming, duality, etc. If not, please pick these
up from the 451 lecture notes/textbooks/Wikipedia/Khan Academy.
- Same for undergraduate probability and linear algebra
courses. E.g., for LinAlg, see whether you can solve
the first
homework from Steve Vavasis' Optimization course at Waterloo. For
probability (and randomized algorithms), I will give some problems in
HW0.
- HW0 will test this preparedness, and give you and us feedback. Students late in submitting HW0 risk being dropped from the course (we are over-subscribed and have a wait-list), sorry! So talk to us in Lecture #1 if you have a good reason for being late.
FAQs:
- Q: Will you give the real-world context for the algorithms you
cover?
This class will focus on the theoretical side, so we will not have much time to motivate the choice of topics
or give much real-world context (of which there is plenty). If you are
looking for the latter, you should consider
taking 15-853: Algos in the Real World.
- Q: I am on the wait-list: will I get in?
We hope everyone who wants to take the class will get in. (Usually some students are "shopping around" in the first couple of weeks, and things settle down after that.) Just come to the lectures, and we will try to clear the wait-list soon. OTOH, if you are on the wait-list because you have too many credits, or are enrolled in a course with a conflicting time-slot, you should fix those issues yourself.
- Q: I am struggling with the homework: what should I do?
A little struggle in doing the homework is expected (and healthy): these are not trivial problems, and you're probably learning something new. However, if you are struggling a fair bit, or if the struggle is unpleasant for you, come talk to us. You may consider taking 15-451 (UG Algos) or 15-750 (Graduate Algos) instead of this course.
Your Health & Well-being
This is an intensive (and somewhat intense) course: 3 days a
week, studying new/advanced topics, several HWs. One major goal is that you will have fun taking it and learning this cool new material, as much as I hope to have fun teaching it. Part of
making sure you have fun involves taking care of yourself. Do your best to maintain a healthy
lifestyle this semester by eating well, exercising, avoiding drugs and
alcohol, getting enough sleep and taking some time to relax. This will
help you achieve your goals and cope with stress.
If you or anyone you know experiences any academic stress, difficult
life events, or feelings like anxiety or depression, we strongly
encourage you to seek support. Counseling and Psychological Services
(CaPS) is here to help: call 412-268-2922 and visit
http://www.cmu.edu/counseling/. Consider reaching out to a friend, faculty member, or family member you trust for help getting connected to the support that can help.
Accommodations for Students with Disabilities:
If you have a disability and are registered with the Office of
Disability Resources, I encourage you to use their online system to
notify me of your accommodations, and to come see me to discuss your needs as early in the semester as possible. Since
this is a fast-moving course, we should discuss how to make things work
out. I will work with you to ensure that accommodations are provided as
appropriate. If you suspect that you may have a disability and would
benefit from accommodations but are not yet registered with the Office
of Disability Resources, I encourage you to contact them at
access@andrew.cmu.edu.
Academic Integrity
Honesty and transparency are important features of good scholarship. On
the flip side, plagiarism and cheating are serious academic offenses
with serious consequences. If you are discovered engaging in either
behavior in this course, you will earn a failing grade on the assignment
in question, and further disciplinary action may be taken. For a clear
description of what counts as plagiarism, cheating, and/or the use of
unauthorized sources, please see
the University
Policy on Academic Integrity and the Carnegie Mellon Code on Academic Integrity.
Maintained by Anupam Gupta