
Computer Science Department, CMU
Gates 7105, praveshk AT cs.cmu.edu
Upcoming Talks
EPFL Workshop on Learning: Optimization and Stochastics [Jun'22]
Dagstuhl Workshop on The Constraint Satisfaction Problem: Complexity and Approximability [May'22]
MIT Theory of Computation Colloquium [April 26, '22]
Harvard Theory Seminar [April 20, '22]
IAS CS/DM Seminar [Feb 28, '22]
"Three Lectures on High-Dimensional Statistical Estimation via Sum-of-Squares", ARC-ACO Lecture Series, Georgia Tech [15-19 Feb'22]
FOCS Workshop on Proof Complexity [7-8 Feb'22]
Research Highlights
My current research aims at designing efficient algorithms and obtaining rigorous evidence of algorithmic thresholds for average-case computational problems arising in theoretical computer science and its intersections with allied areas such as statistics. Part of this research program is currently supported by an NSF CAREER Award (2021-2026) on The Nature of Average-Case Computation and a Sloan Fellowship (2022). One highlight of this research program has been the development of the sum-of-squares method, which brings ideas from proof complexity to the design of efficient algorithms. See my recent monograph Semialgebraic Proofs and Efficient Algorithm Design with Pitassi and Fleming for an introduction, and the lecture notes from my Fall 2021 CMU class for details.
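As a toy illustration of the proofs-to-algorithms theme behind this method (a standard textbook-style example, not tied to any particular result above): a sum-of-squares certificate expresses a polynomial inequality as an explicit sum of squares, a form from which nonnegativity can be read off and which semidefinite programming can search for efficiently. For instance,
\[ 2x^2 - 2x + 1 \;=\; x^2 + (x-1)^2 \;\ge\; 0, \]
and exhibiting such a decomposition is a degree-2 sum-of-squares proof of the inequality $2x^2 - 2x + 1 \ge 0$.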
An exciting recent success of this research program is improved efficient algorithms for learning mixtures of high-dimensional Gaussians, obtained in a sequence of papers beginning in 2018 that resolves an open question Santosh Vempala asked in the wake of the 2010 breakthrough on learning Gaussian mixtures from i.i.d. samples. The tools developed here (sum-of-squares certificates for analytic properties such as subgaussianity, anti-concentration, and hypercontractivity) also led to algorithms for basic tasks in algorithmic robust statistics such as linear regression and moment estimation, perturbation-resilient algorithms for Sparse PCA, and a new blueprint for designing list-decodable learning algorithms that finally resolved the basic question of covariance estimation. This work is currently supported by a Google Research Scholar Award for Efficient Algorithms for Robust Machine Learning. For an overview of how sum-of-squares certificates of basic analytic inequalities play a major role in these developments, see my recent survey talk, with key open questions, given at the Simons Institute.
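To give a concrete (and deliberately simple, illustrative) example of the analytic properties these certificates capture: in its simplest form, certifiable subgaussianity asks that the fourth-moment bound $\mathbb{E}_{x \sim D} \langle v, x\rangle^4 \le C\,(\mathbb{E}_{x \sim D} \langle v, x\rangle^2)^2$, viewed as a polynomial inequality in the direction $v$, admit a low-degree (here, degree-4) sum-of-squares proof. For the standard Gaussian this holds with $C = 3$, since
\[ \mathop{\mathbb{E}}_{x \sim \mathcal{N}(0, I_d)} \langle v, x\rangle^4 \;=\; 3\,\|v\|_2^4 \;=\; 3\Big(\mathop{\mathbb{E}}_{x \sim \mathcal{N}(0, I_d)} \langle v, x\rangle^2\Big)^2, \]
so the inequality is an exact polynomial identity and trivially admits a sum-of-squares certificate; the papers above develop and exploit such certificates for much broader families of distributions and their samples.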
Another exciting consequence of this program is a satisfyingly complete picture of the complexity of refuting random constraint satisfaction problems (CSPs) such as 3SAT. This involves tight thresholds for the sum-of-squares method for refuting random CSPs and new spectral algorithms with surprisingly simple analyses that extend to semi-random instances and settle Feige's Conjecture on the existence of even covers in worst-case hypergraphs. Building on the algorithms from this line of work yields attacks on a couple of proposed constructions of cryptographic pseudorandom generators.
This research program has also found rigorous evidence for average-case algorithmic thresholds via sharp sum-of-squares lower bounds for average-case problems. One highlight of this work was the discovery of pseudo-calibration, which led to tight SoS thresholds for the Planted Clique problem and uncovered relationships of the SoS algorithm with spectral methods and the low-degree polynomial method.
See Selected Papers below for videos of recorded talks on all the works mentioned above. See the complete list of my papers for more details.
Advising
Postdocs: Alperen Ergür (now Assistant Professor of Math and CS at UT San Antonio), Mitali Bafna (starting Fall 2022)
PhD Students: Ainesh Bakshi (co-advised with David Woodruff), Tim Hsieh, Xinyu Wu (co-advised with Ryan O'Donnell, winner of the MSR Ada Lovelace Fellowship), Jeff Xu
Undergraduates: Misha Ivkov (winner of the Mark Stehlik SCS Alumni Undergraduate Impact Scholarship, now PhD student at Stanford), Zachary Sussman (winner of the Alan Newell Award for best undergraduate SCS Thesis)
Some Representative Papers (for all papers, see here)
Learning Mixtures of Gaussians
- Robustly Learning a Mixture of $k$ Arbitrary Gaussians
With Ainesh Bakshi, Ilias Diakonikolas, He Jia, Daniel Kane, and Santosh Vempala
STOC, 2022 (to appear). LONG TALK.
- Outlier-Robust Clustering of Non-Spherical Mixtures
With Ainesh Bakshi
FOCS, 2020. LONG TALK.
Conference version to be merged with this paper.
- Better Agnostic Clustering Via Relaxed Tensor Norms
With Jacob Steinhardt
STOC, 2018. BOARD TALK with a simpler, complete proof!
Conference version merged with this paper.
Complexity of Refuting Random/Semi-random CSPs
- Algorithms and Certificates for Boolean CSP Refutation: Smoothed is no Harder than Random
With Venkat Guruswami and Peter Manohar
STOC, 2022 (to appear). LONG TALK.
- Strongly refuting all semi-random Boolean CSPs
With Jackson Abascal and Venkat Guruswami.
SODA, 2021. LONG TALK.
- Sum-of-Squares Lower Bounds for Refuting any CSP
With Ryuhei Mori, Ryan O'Donnell and David Witmer
STOC, 2017.
Algorithmic Robust Statistics
- List-Decodable Covariance Estimation
With Misha Ivkov
STOC, 2022 (to appear).
- List-Decodable Subspace Recovery: Dimension Independent Error in Polynomial Time
With Ainesh Bakshi
SODA, 2021.
- List-Decodable Linear Regression
With Sushrut Karmalkar and Adam Klivans
NeurIPS Spotlight, 2019. LONG TALK.
- Sparse PCA: algorithms, adversarial perturbations and certificates
With Tommaso D'Orsi, Gleb Novikov, and David Steurer.
FOCS, 2020.
- Efficient Algorithms for Outlier-Robust Regression
With Adam Klivans and Raghu Meka
COLT 2018
Prasad Raghavendra's Exposition of our algorithm in a Simons Institute Bootcamp.
- Outlier-Robust Moment Estimation Via Sum-of-Squares
With David Steurer
STOC, 2018. 2-hour BOARD TALK.
Conference version merged with this paper.
Complexity of Unique Games and Related Problems
- Playing Unique Games on Certified Small-Set Expanders
With Mitali Bafna, Boaz Barak, Tselil Schramm and David Steurer.
STOC, 2021.
- Quantum Entanglement, Sum-of-Squares and the Log-Rank Conjecture
With Boaz Barak and David Steurer
STOC, 2017
Recent 50 min talk and a shorter talk with a different perspective.
- Small-Set Expansion in Shortcode Graph and the 2-to-1 Conjecture
With Boaz Barak and David Steurer
A generally accessible article on the recent proof of the 2-to-2 games conjecture that partly relies on this work.
Average-Case Algorithmic Thresholds
- Algorithmic Thresholds for Refuting Random Polynomial Systems
With Tim Hsieh.
SODA, 2022.
- A Stress-Free Sum-of-Squares Lower Bound for Coloring
With Peter Manohar.
CCC, 2021.
- The power of sum-of-squares for detecting hidden structures
With Samuel B. Hopkins, Aaron Potechin, Prasad Raghavendra, Tselil Schramm and David Steurer
FOCS 2017
Invited to the Highlights of Algorithms (HALG) 2018.
- A Nearly Tight Sum-of-Squares Lower Bound for Planted Clique
With Boaz Barak, Sam Hopkins, Jon Kelner, Ankur Moitra and Aaron Potechin
FOCS, 2016. [Video: IAS CS/DM Seminar] [Boaz's WOT post]
Invited to SICOMP Special Issue for FOCS 2016.
- SoS and Planted Clique: Tight Analysis of MPW Moments at all Degrees and an Optimal Lower Bound at Degree Four
With Samuel B. Hopkins and Aaron Potechin
SODA 2016
Invited to the ACM Transactions on Algorithms, Special Issue for SODA 2016
Conference version merged with this paper.
- Approximating Rectangles by Juntas and a Weakly Exponential Lower Bound for LP Relaxations of CSPs
With Raghu Meka and Prasad Raghavendra
STOC, 2017
Invited to SICOMP Special Issue for STOC 2017
Recent 50 min talk.
Applications
- Sum-of-Squares Meets Nash: Lower Bounds for finding any equilibrium
With Ruta Mehta
STOC, 2018.
- Limits on Low-Degree Pseudorandom Generators (Or: Sum-of-Squares Meets Program Obfuscation)
With Boaz Barak, Zvika Brakerski and Ilan Komargodski
EUROCRYPT 2018
Service
PC Member: APPROX/RANDOM 2018, SODA 2019, STOC 2020, ITCS 2020, CCC 2022, RANDOM 2022, FSTTCS 2022. NeurIPS Area Chair: 2020, 2021.
Mentoring Talks
I recently gave two mentoring talks as part of the new workshops organized by the Learning Theory Alliance.
Slides from talk on Interacting with your Research Community.
Slides from talk on Thoughts on PhD Admissions.