I am a PhD student in computer science at Carnegie Mellon University, advised by Nina Balcan and Ameet Talwalkar. My research focuses on the foundations and applications of machine learning, in particular the theoretical and practical understanding of meta-learning and automation. I am a recipient of the Facebook PhD Fellowship and have interned at Microsoft Research - New England. Previously, I worked on fusion plasma simulations at Lawrence Livermore National Laboratory and received an AB in Mathematics and an MSE in Computer Science from Princeton University.
Rethinking Neural Operations for Diverse Tasks.
Initialization and Regularization of Factorized Neural Layers. To Appear in ICLR 2021.
Geometry-Aware Gradient Algorithms for Neural Architecture Search. To Appear in ICLR 2021.
A Sample Complexity Separation between Non-Convex and Convex Meta-Learning. ICML 2020.
Differentially Private Meta-Learning. ICLR 2020.
Adaptive Gradient-Based Meta-Learning Methods. NeurIPS 2019.
A Theoretical Analysis of Contrastive Unsupervised Representation Learning. ICML 2019.
Provable Guarantees for Gradient-Based Meta-Learning. ICML 2019.