
Pravesh K. Kothari

Assistant Professor (Sep 2019-)
Computer Science Department, CMU
Gates 7105, praveshk AT cs.cmu.edu

Upcoming Talks/Events

SIAM Conference on Optimization [May '23]
Tel Aviv University Theory Fest, Plenary Session [Dec '22]
Workshop on Algorithms Under Uncertainty, Chennai [Dec '22]
Simons Institute Reunion Workshop on CCSI [Dec '22]
Princeton Center for Theoretical Science Workshop on Positivity [Nov '22]
CRM Workshop on Tensors in Algebraic Geometry [Nov '22]
Professional Mentoring Panel at FOCS 2022 [Nov '22]
UT Theory Seminar [Oct '22]
ACO Seminar, UC Irvine [Oct '22]
Berkeley Theory Lunch [Oct '22]
Stanford Theory Lunch [Oct '22]
Santa Fe Working Group on Foundations of Machine Learning [Jul '22]
Fields Workshop on Differential Privacy [Jul '22]
EPFL Workshop on Learning: Optimization and Stochastics [Jun '22]
Dagstuhl Workshop on The Constraint Satisfaction Problem: Complexity and Approximability [May '22]
MIT Theory of Computation Colloquium [Apr 26, '22]
Harvard Theory Seminar [Apr 20, '22]
IAS CS/DM Seminar [Feb 28, '22]
"Three Lectures on High-Dimensional Statistical Estimation via Sum-of-Squares", ARC-ACO Lecture Series, Georgia Tech [Feb 15-19, '22]
FOCS Workshop on Proof Complexity [Feb 7-8, '22]

Research Highlights

See the complete list of my papers for details, or browse selected papers below, arranged roughly by topic, with links to videos of recorded talks.

My current research aims at designing efficient algorithms and obtaining rigorous evidence of algorithmic thresholds for average-case computational problems arising in theoretical computer science and its intersections with allied areas such as statistics. Part of this research program is currently supported by an NSF CAREER Award (2021-2026) on The Nature of Average-Case Computation and a Sloan Fellowship (2022).
One highlight of this research program has been the development of the sum-of-squares method, which brings in ideas from proof complexity to give an "on-demand" design and analysis of semidefinite programming relaxations for hard optimization problems. See my recent monograph Semialgebraic Proofs and Efficient Algorithm Design with Pitassi and Fleming for an introduction, and the lecture notes from my Fall 2021 CMU class for details.
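
To make the "on-demand relaxation" idea concrete, here is a minimal sketch of the degree-2 sum-of-squares relaxation of Max-Cut, which coincides with the classic Goemans-Williamson SDP. This is my own toy illustration (not code from the monograph), and it assumes numpy and cvxpy are installed.

import numpy as np
import cvxpy as cp

def maxcut_sdp_value(A):
    """Upper bound the Max-Cut value of the graph with adjacency matrix A
    via the basic (degree-2 sum-of-squares) SDP relaxation."""
    n = A.shape[0]
    X = cp.Variable((n, n), PSD=True)   # pseudo-moment matrix standing in for x x^T
    constraints = [cp.diag(X) == 1]     # encodes x_i^2 = 1 for +/-1 variables
    # cut(x) = sum over edges of (1 - x_i x_j)/2; A counts each edge twice
    objective = cp.Maximize(cp.sum(cp.multiply(A, 1 - X)) / 4)
    return cp.Problem(objective, constraints).solve()

# Usage: a 4-cycle, whose maximum cut is 4 (the relaxation is tight here).
A = np.array([[0, 1, 0, 1], [1, 0, 1, 0], [0, 1, 0, 1], [1, 0, 1, 0]])
print(maxcut_sdp_value(A))  # ~4.0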

An exciting recent success of this research program is the resolution of the learnability of mixtures of high-dimensional Gaussians: in a sequence of papers beginning in 2018, we settled the central open problem in high-dimensional robust statistics by finding a polynomial-time robust algorithm for learning a mixture of arbitrary Gaussians. The tools developed here (sum-of-squares certificates for analytic properties such as subgaussianity, anti-concentration, and hypercontractivity) led to algorithms for a host of basic tasks in algorithmic robust statistics, such as linear regression, moment estimation, and perturbation-resilient algorithms for Sparse PCA, as well as a new blueprint for designing list-decodable learning algorithms that finally resolved the basic question of covariance estimation. This work is currently supported by a Google Research Scholar Award for Efficient Algorithms for Robust Machine Learning. For an overview of how sum-of-squares certificates of basic analytic inequalities drive these developments, see my recent survey talk (with key open questions) given at the Simons Institute.
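
As a toy illustration of the filtering idea that pervades algorithmic robust statistics, here is a deliberately simplified sketch (the function name and threshold are mine; the papers above use sum-of-squares algorithms with provable guarantees that this version does not have): prune samples along high-variance directions until the empirical covariance looks consistent with the inlier distribution.

import numpy as np

def filtered_mean(X, eps, n_iter=20):
    """Estimate the mean of samples X (rows), an eps-fraction of which may be
    adversarially corrupted, by pruning outliers along high-variance directions."""
    X = X.copy()
    for _ in range(n_iter):
        mu = X.mean(axis=0)
        eigvals, eigvecs = np.linalg.eigh(np.cov(X, rowvar=False))
        if eigvals[-1] <= 1 + 10 * eps:   # covariance consistent with N(0, I) inliers
            break
        v = eigvecs[:, -1]                # direction of largest variance
        scores = ((X - mu) @ v) ** 2
        X = X[scores <= np.quantile(scores, 1 - eps)]  # drop the worst eps-fraction
    return X.mean(axis=0)

# Usage: 5% of N(0, I) samples replaced by a far-away point mass.
rng = np.random.default_rng(0)
X = rng.standard_normal((2000, 20))
X[:100] = 10.0                            # corrupted samples
print(np.linalg.norm(X.mean(axis=0)), np.linalg.norm(filtered_mean(X, 0.05)))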

This research program has also had some remarkable successes in resolving longstanding open questions in algorithms for random, semirandom, and smoothed variants of NP-hard combinatorial optimization problems. One outcome is a satisfyingly complete picture of the complexity of refuting random constraint satisfaction problems (CSPs) such as 3SAT: tight thresholds for the sum-of-squares method for refuting random CSPs, along with new spectral algorithms with surprisingly simple analyses that extend to semirandom instances, settle Feige's conjecture on the existence of even covers in worst-case hypergraphs, and yield a near-cubic lower bound for 3-query locally decodable codes. Very recently, we nearly resolved an open question of Feige and Steinhardt by finding the first polynomial-time algorithm for semirandom planted clique that recovers planted cliques of size approaching the nearly optimal bound of sqrt(n), improving on prior works that encountered a "basic SDP" barrier at n^{2/3}. Building on the algorithms from this line of work also yields attacks on a couple of proposed constructions of cryptographic pseudorandom generators.
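
For contrast with the semirandom setting, here is a hedged toy sketch of the classic spectral approach to fully random planted clique (in the spirit of Alon, Krivelevich, and Sudakov, run at a comfortably large clique size); the semirandom algorithm in the paper above is substantially more involved.

import numpy as np

def spectral_clique_candidates(G, k):
    """Return k candidate clique vertices: the largest coordinates (in absolute
    value) of the top eigenvector of the centered adjacency matrix."""
    n = G.shape[0]
    M = G - 0.5 * (np.ones((n, n)) - np.eye(n))   # subtract the G(n, 1/2) mean
    _, eigvecs = np.linalg.eigh(M)
    return np.argsort(-np.abs(eigvecs[:, -1]))[:k]

# Usage: plant a clique of size 8*sqrt(n) in G(n, 1/2).
rng = np.random.default_rng(1)
n = 800
k = int(8 * np.sqrt(n))
G = np.triu(rng.integers(0, 2, (n, n)), 1)
G = G + G.T                                        # symmetric 0/1 adjacency matrix
S = rng.choice(n, size=k, replace=False)
G[np.ix_(S, S)] = 1 - np.eye(k, dtype=int)         # plant the clique on S
found = spectral_clique_candidates(G, k)
print(len(set(found) & set(S)) / k)                # fraction of the clique recovered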

This research program has also found rigorous evidence for average-case algorithmic thresholds via sharp sum-of-squares lower bounds for average-case problems. One highlight of this work was the discovery of pseudo-calibration, which led to tight SoS thresholds for the Planted Clique problem and uncovered relationships between the SoS algorithm, spectral methods, and the low-degree polynomial method.
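
To give a flavor of what the low-degree polynomial method measures (a self-contained toy of mine; the actual predictions concern degree-O(log n) polynomials and sharper thresholds): even the simplest degree-1 statistic, the edge count, separates the planted-clique distribution from G(n, 1/2) once the planted clique is large enough.

import numpy as np

rng = np.random.default_rng(2)
n, k, trials = 400, 80, 200

def edge_count(planted):
    """Sample G(n, 1/2), optionally with a planted k-clique, and return the
    edge count -- a degree-1 polynomial in the edge indicator variables."""
    G = np.triu(rng.integers(0, 2, (n, n)), 1)     # strict upper triangle = edges
    if planted:
        S = np.sort(rng.choice(n, size=k, replace=False))
        G[np.ix_(S, S)] = np.triu(np.ones((k, k), dtype=int), 1)
    return int(G.sum())

null = np.array([edge_count(False) for _ in range(trials)])
alt = np.array([edge_count(True) for _ in range(trials)])
# A large z-score means this degree-1 statistic already distinguishes the two.
print((alt.mean() - null.mean()) / null.std())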

Current Teaching

15-155: The Computational Lens
A class on using the theory of computation as a lens on other fields of inquiry.
Tue-Thu, 3:30-4:50pm, GHC 4102
Office Hours: after class

Service

Workshop Co-Chair, FOCS 2023 and 2024

PC Member: APPROX/RANDOM 2018, SODA 2019, STOC 2020, ITCS 2020, CCC 2022, RANDOM 2022, FSTTCS 2022, FOCS 2023. NeurIPS Area Chair: 2020, 2021.

Advising

Postdocs:
Alperen Ergür (Fall 2019-20, now Assistant Professor of Math and CS at UT San Antonio)
Mitali Bafna (Fall 2022-23)
PhD Students:
Ainesh Bakshi (co-advised with David Woodruff, graduated 2022, now postdoc at MIT Math)
Tim Hsieh
Peter Manohar (co-advised with Venkat Guruswami)
Noah Singer (co-advised with Aayush Jain)
Xinyu Wu (co-advised with Ryan O'Donnell, winner of the MSR Ada Lovelace Fellowship)
Jeff Xu (winner of the 2022 CyLab Presidential Fellowship)
Undergraduates:
Prashanti Anderson (Alan J. Perlis Award for student teaching, runner-up for the Alan Newell Award for best undergraduate SCS thesis of 2023, will join the MIT EECS PhD program in Fall '23)
Misha Ivkov (winner of the Mark Stehlik SCS Alumni Undergraduate Impact Scholarship, theory PhD student at Stanford since Fall '21)
Zachary Sussman (winner of the Alan Newell Award for best undergraduate SCS thesis)

Some Representative Papers (for all papers, see here)

Algorithms for Learning Mixtures of Gaussians

Algorithms and Thresholds for Semirandom and Smoothed Models

Algorithmic Robust Statistics and Related Topics

Algorithms and Complexity of Unique Games and Related Problems

Average-Case Algorithmic Thresholds

Applications

Funding

My research is generously supported by an NSF CAREER Award (2021), a Google Research Scholar Award (2021), an Alfred P. Sloan Fellowship (2022), and an NSF Medium Grant for collaborative research with Venkat Guruswami (2022).

Mentoring Talks

I recently gave two mentoring talks as part of the new workshops organized by the Learning Theory Alliance:
Slides from my talk on Interacting with your Research Community.
Slides from my talk on Thoughts on PhD Admissions.