
Pravesh K. Kothari

Assistant Professor (Sep 2019-)
Computer Science Department, CMU
Gates 7105, praveshk AT cs.cmu.edu

Upcoming Talks

EPFL Workshop on Learning: Optimization and Stochastics [Jun'22]
Dagstuhl Workshop on The Constraint Satisfaction Problem: Complexity and Approximability [May'22]
MIT Theory of Computation Colloquium [April 26, '22]
Harvard Theory Seminar [April 20, '22]
IAS CS/DM Seminar [Feb 28, '22]
"Three Lectures on High-Dimensional Statistical Estimation via Sum-of-Squares", ARC-ACO Lecture Series, Georgia Tech [15-19 Feb'22]
FOCS Workshop on Proof Complexity [7-8 Feb'22]

Research Highlights

My current research aims at designing efficient algorithms and obtaining rigorous evidence of algorithmic thresholds for average-case computational problems arising in theoretical computer science and its intersections with allied areas such as statistics. Part of this research program is currently supported by an NSF CAREER Award (2021-2026) on The Nature of Average-Case Computation and a Sloan Fellowship (2022). One highlight of this research program has been the development of the sum-of-squares method, which brings ideas from proof complexity into the design of efficient algorithms. See my recent monograph Semialgebraic Proofs and Efficient Algorithm Design with Pitassi and Fleming for an introduction, and the lecture notes from my Fall 2021 CMU class for details.
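As a rough schematic (the notation below is mine and not tied to any single paper): the degree-d sum-of-squares method relaxes polynomial optimization to a semidefinite program over "pseudo-expectations", linear functionals on low-degree polynomials that pass every test expressible by degree-d sum-of-squares reasoning:

\[ \max_{\tilde{\mathbb{E}}} \ \tilde{\mathbb{E}}[p(x)] \quad \text{subject to} \quad \tilde{\mathbb{E}}[1] = 1 \ \text{ and } \ \tilde{\mathbb{E}}[q(x)^2] \geq 0 \ \text{ for every } q \text{ with } \deg(q) \leq d/2. \]

The dual view is a restricted proof system in which an inequality is certified precisely when it can be written as a sum of squares of low-degree polynomials; this is where proof complexity enters the picture.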

An exciting recent success of this research program is improved efficient algorithms for learning mixtures of high-dimensional Gaussians, obtained in a sequence of papers beginning in 2018 that resolves an open question Santosh Vempala asked in the wake of the 2010 breakthrough on learning Gaussian mixtures from i.i.d. samples. The tools developed here (sum-of-squares certificates for analytic properties such as subgaussianity, anti-concentration, and hypercontractivity) also led to algorithms for basic tasks in algorithmic robust statistics such as linear regression and moment estimation, perturbation-resilient algorithms for Sparse PCA, and a new blueprint for designing list-decodable learning algorithms that finally resolved the basic question of covariance estimation. This work is currently supported by a Google Research Scholar Award for Efficient Algorithms for Robust Machine Learning. For an overview of how sum-of-squares certificates of basic analytic inequalities play a major role in these developments, see my recent survey talk, with key open questions, given at the Simons Institute.
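To give the flavor of one such certificate (my paraphrase of certifiable subgaussianity; the exact constants vary across papers): a distribution D over \(\mathbb{R}^n\) is k-certifiably C-subgaussian if the standard directional moment bound admits a low-degree sum-of-squares proof in the direction variable v, i.e.,

\[ (Ck)^k \Big( \mathop{\mathbb{E}}_{x \sim D} \langle x, v \rangle^2 \Big)^k \;-\; \mathop{\mathbb{E}}_{x \sim D} \langle x, v \rangle^{2k} \ \text{ is a sum of squares of polynomials in } v. \]

Certificates of this kind let the relaxation "use" analytic properties of the unknown distribution that an identifiability argument would normally invoke, effectively turning the proof into an algorithm.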

Another exciting consequence of this program is a satisfyingly complete picture of the complexity of refuting random constraint satisfaction problems (CSPs) such as 3SAT. This involves tight thresholds for the sum-of-squares method for refuting random CSPs, along with new spectral algorithms with surprisingly simple analyses that extend to semi-random instances and settle Feige's conjecture on the existence of even covers in worst-case hypergraphs. Building on the algorithms from this line of work, we also obtained attacks on a couple of proposed constructions of cryptographic pseudorandom generators.
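Concretely (a standard formulation, with thresholds stated loosely): a refutation algorithm for random k-SAT with n variables and m clauses must output a certificate of unsatisfiability that is always correct and must succeed with high probability over the draw of the instance. For polynomial-time sum-of-squares-based algorithms, the picture is roughly

\[ m \gtrsim n^{k/2} \ \Longrightarrow \ \text{efficient refutation is known}, \qquad m \leq n^{k/2 - \epsilon} \ \Longrightarrow \ \text{refutation requires sum-of-squares degree } n^{\Omega(\epsilon)}. \]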

This research program has also found rigorous evidence for average-case algorithmic thresholds via sharp sum-of-squares lower bounds for average-case problems. One highlight of this work was the discovery of pseudo-calibration, which led to tight SoS thresholds for the Planted Clique problem and uncovered relationships between the SoS algorithm, spectral methods, and the low-degree polynomial method.
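In rough terms (my paraphrase of the heuristic): pseudo-calibration constructs the candidate pseudo-expectation for a random, unplanted instance G by requiring that no low-degree test can distinguish it from the planted distribution:

\[ \mathop{\mathbb{E}}_{G \sim \text{random}} \Big[ f(G)\, \tilde{\mathbb{E}}_G[p] \Big] \;=\; \mathop{\mathbb{E}}_{(G,x) \sim \text{planted}} \Big[ f(G)\, p(x) \Big] \quad \text{for all low-degree } f \text{ and } p. \]

This matching of low-degree moments is also what ties the resulting lower bounds to spectral methods and the low-degree polynomial method.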

See Selected Papers below for links to recorded talks on all of the works mentioned above. See the complete list of my papers for more details.

Advising

Postdoc: Alperen Ergür (now Assistant Professor of Math and CS at UT San Antonio), Mitali Bafna (starting Fall 2022)
PhD Students: Ainesh Bakshi (co-advised with David Woodruff), Tim Hsieh, Xinyu Wu (co-advised with Ryan O'Donnell, winner of the MSR Ada Lovelace Fellowship), Jeff Xu
Undergraduates: Misha Ivkov (winner of the Mark Stehlik SCS Alumni Undergraduate Impact Scholarship, now PhD student at Stanford), Zachary Sussman (winner of the Alan Newell Award for best undergraduate SCS Thesis)

Some Representative Papers (for all papers, see here)

Learning Mixtures of Gaussians

Complexity of Refuting Random/Semi-random CSPs

Algorithmic Robust Statistics

Complexity of Unique Games and Related Problems

Average-Case Algorithmic Thresholds

Applications

Service

PC Member: APPROX/RANDOM 2018, SODA 2019, STOC 2020, ITCS 2020, CCC 2022, RANDOM 2022, FSTTCS 2022. NeurIPS Area Chair: 2020, 2021.

Mentoring Talks

I recently gave two mentoring talks as part of the new workshops organized by the Learning Theory Alliance.
Slides from my talk on Interacting with your Research Community.
Slides from my talk on Thoughts on PhD Admissions.