\( \newcommand{\pmset}{ \{\pm 1\}} \newcommand{\on}{\{\pm 1\}} \newcommand{\cM}{\mathcal{M}} \newcommand{\R}{\mathbb{R}} \)

Pravesh K. Kothari

Assistant Professor (Sep 2019-)
Computer Science Department, CMU
Gates 7105, praveshk AT cs.cmu.edu

Upcoming Talks/Events

Tel Aviv University Theory Fest, Plenary Session [Dec '22]
Workshop on Algorithms Under Uncertainty, Chennai [Dec '22]
Simons Institute Reunion Workshop on CCSI [Dec '22]
Princeton Center for Theoretical Science Workshop on Positivity [Nov '22]
CRM Workshop on Tensors in Algebraic Geometry [Nov '22]
Professional Mentoring Panel at FOCS 2022 [Nov '22]
UT Theory Seminar [Oct '22]
ACO Seminar, UC Irvine [Oct '22]
Berkeley Theory Lunch [Oct '22]
Stanford Theory Lunch [Oct '22]
Santa Fe Working Group on Foundations of Machine Learning [Jul '22]
Fields Workshop on Differential Privacy [Jul '22]
EPFL Workshop on Learning: Optimization and Stochastics [Jun '22]
Dagstuhl Workshop on The Constraint Satisfaction Problem: Complexity and Approximability [May '22]
MIT Theory of Computation Colloquium [Apr 26, '22]
Harvard Theory Seminar [Apr 20, '22]
IAS CS/DM Seminar [Feb 28, '22]
"Three Lectures on High-Dimensional Statistical Estimation via Sum-of-Squares", ARC-ACO Lecture Series, Georgia Tech [Feb 15-19, '22]
FOCS Workshop on Proof Complexity [Feb 7-8, '22]

Research Highlights

See the complete list of my papers for details.
See the selected papers below, arranged roughly by topic, with links to videos of recorded talks.

My current research aims at designing efficient algorithms and obtaining rigorous evidence of algorithmic thresholds for average-case computational problems arising in theoretical computer science and its intersections with allied areas such as statistics. Part of this research program is currently supported by an NSF CAREER Award (2021-2026) on The Nature of Average-Case Computation and a Sloan Fellowship (2022). One highlight of this research program has been the development of the sum-of-squares method, which brings ideas from proof complexity to the design of efficient algorithms. See my recent monograph Semialgebraic Proofs and Efficient Algorithm Design with Pitassi and Fleming for an introduction, and the lecture notes from my Fall 2021 CMU class for details.
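As a toy illustration of the kind of reasoning the method automates (my example here, not one taken from the monograph): a degree-\(d\) sum-of-squares proof of a polynomial inequality \(p(x) \ge 0\) subject to constraints \(g_1(x) \ge 0, \ldots, g_m(x) \ge 0\) is a polynomial identity
\[
p = \sigma_0 + \sum_{i=1}^{m} \sigma_i \, g_i,
\]
where each \(\sigma_i\) is a sum of squares of polynomials and every term has degree at most \(d\). The simplest instance is the degree-2 certificate of the AM-GM inequality: \(x^2 + y^2 - 2xy = (x-y)^2 \ge 0\) for all \(x, y \in \R\). The algorithmic punchline is that such certificates, whenever they exist, can be found by solving a semidefinite program of size \(n^{O(d)}\).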

An exciting recent success of this research program is improved efficient algorithms for learning mixtures of high-dimensional Gaussians, obtained in a sequence of papers beginning in 2018, resolving an open question that Santosh Vempala asked in the wake of the 2010 breakthrough on learning Gaussian mixtures from i.i.d. samples. The tools developed here (sum-of-squares certificates for analytic properties such as subgaussianity, anti-concentration, and hypercontractivity) also led to algorithms for basic tasks in algorithmic robust statistics such as linear regression and moment estimation, perturbation-resilient algorithms for Sparse PCA, and a new blueprint for designing list-decodable learning algorithms that finally resolved the basic question of covariance estimation in that setting. This work is currently supported by a Google Research Scholar Award for Efficient Algorithms for Robust Machine Learning. For an overview of how sum-of-squares certificates of basic analytic inequalities drive these developments, see my recent survey talk, given at the Simons Institute, which also discusses key open questions.
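To give a rough sense of what such a certificate looks like (paraphrasing the notion from these papers, with constants elided): a distribution \(D\) over \(\R^n\) is \(t\)-certifiably \(C\)-subgaussian if the polynomial inequality in \(v\)
\[
\mathop{\mathbb{E}}_{x \sim D} \langle x, v \rangle^{t} \;\le\; (C t)^{t/2} \left( \mathop{\mathbb{E}}_{x \sim D} \langle x, v \rangle^{2} \right)^{t/2}
\]
admits a degree-\(t\) sum-of-squares proof. Gaussians (and affine transforms of products of subgaussian distributions) satisfy this for every even \(t\), and it is this efficiently verifiable surrogate for subgaussianity that the clustering and robust-estimation algorithms above exploit.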

Another exciting consequence of this program is a satisfyingly complete picture of the complexity of refuting random constraint satisfaction problems (CSPs) such as 3SAT. This includes tight thresholds for the sum-of-squares method for refuting random CSPs, along with new spectral algorithms with surprisingly simple analyses that extend to semi-random instances, settle Feige's conjecture on the existence of even covers in worst-case hypergraphs, and yield a near-cubic lower bound for 3-query locally decodable codes. Building on the algorithms from this line of work also yields attacks on a couple of proposed constructions of cryptographic pseudorandom generators.
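For a rough quantitative sense of these thresholds (stated from memory, up to polylogarithmic factors): for a random \(k\)-XOR or \(k\)-SAT instance on \(n\) variables with \(m\) constraints, degree-\(d\) sum-of-squares refutes the instance with high probability when
\[
m \;\gtrsim\; \tilde{O}(n) \cdot \left( \frac{n}{d} \right)^{k/2 - 1},
\]
and provably fails below this bound. In particular, constant degree suffices once \(m \approx n^{k/2}\), while degree \(\tilde{\Omega}(n)\) is needed near the satisfiability threshold \(m \approx n\). Semi-random and smoothed variants of these refutation algorithms drive the applications to even covers and locally decodable codes mentioned above.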

This research program has also found rigorous evidence for average-case algorithmic thresholds via sharp sum-of-squares lower bounds for average-case problems. One highlight of this work was the discovery of pseudo-calibration, which led to tight SoS thresholds for the Planted Clique problem and uncovered relationships between the SoS algorithm, spectral methods, and the low-degree polynomial method.
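The pseudo-calibration heuristic can be summarized in one equation (my paraphrase of the idea): to build a candidate degree-\(d\) pseudo-expectation \(\tilde{\mathbb{E}}_G\) for, say, Planted Clique, one asks that it agree with the planted distribution against every low-degree test, i.e.,
\[
\mathop{\mathbb{E}}_{G \sim G(n,1/2)} \left[ \tilde{\mathbb{E}}_G[f] \cdot p(G) \right] \;=\; \mathop{\mathbb{E}}_{(G,x) \sim \text{planted}} \left[ f(x) \cdot p(G) \right]
\]
for all low-degree polynomials \(f\) (in the clique indicator \(x\)) and \(p\) (in the graph \(G\)). These linear constraints pin down \(\tilde{\mathbb{E}}\), and the resulting construction underlies the tight \(n^{1/2 - o(1)}\) SoS lower bound for Planted Clique; the same agreement condition also makes the connection to the low-degree polynomial method transparent.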

Advising

Postdocs: Alperen Ergür (Fall 2019-20, now Assistant Professor of Math and CS at UT San Antonio),
Mitali Bafna (Fall 2022-23)
PhD Students:
Ainesh Bakshi (co-advised with David Woodruff, graduated 2022, now a postdoc at MIT Math),
Tim Hsieh,
Peter Manohar (co-advised with Venkat Guruswami),
Noah Singer (co-advised with Aayush Jain),
Xinyu Wu (co-advised with Ryan O'Donnell, winner of the MSR Ada Lovelace Fellowship),
Jeff Xu (winner of the 2022 Cylab Presidential Fellowship)
Undergraduates: Prashanti Anderson (2021-22),
Misha Ivkov (2020-21) (winner of the Mark Stehlik SCS Alumni Undergraduate Impact Scholarship, now PhD student at Stanford),
Zachary Sussman (2019-20) (winner of the Alan Newell Award for best undergraduate SCS Thesis)

Some Representative Papers (for all papers, see here)

Algorithms for Learning Mixtures of Gaussians

Algorithms and Thresholds for Refuting Random/Semi-random CSPs and Applications

Algorithmic Robust Statistics

Algorithms and Complexity of Unique Games and Related Problems

Average-Case Algorithmic Thresholds

Applications

Funding

My research is generously supported by an NSF CAREER Award (2021), a Google Research Scholar Award (2021), an Alfred P. Sloan Fellowship (2022), and an NSF Medium Grant for collaborative research with Venkat Guruswami (2022).

Service

PC member: APPROX/RANDOM 2018, SODA 2019, STOC 2020, ITCS 2020, CCC 2022, RANDOM 2022, FSTTCS 2022, FOCS 2023. NeurIPS Area Chair: 2020, 2021.

Mentoring Talks

I recently gave two mentoring talks as part of the new workshops organized by the Learning Theory Alliance.
Slides from my talk on Interacting with your Research Community.
Slides from my talk on Thoughts on PhD Admissions.