I am a PhD student in computer science at Carnegie Mellon University, advised by Nina Balcan and Ameet Talwalkar. My research focuses on the foundations and applications of machine learning, in particular on the theoretical and practical understanding of meta-learning and automation. I am a recipient of the Facebook PhD Fellowship and have interned at Microsoft Research New England, Lawrence Livermore National Laboratory, and the Princeton Plasma Physics Laboratory. Previously, I received an AB in Mathematics and an MSE in Computer Science from Princeton University.
Learning Predictions for Algorithms with Predictions. Mikhail Khodak, Maria-Florina Balcan, Ameet Talwalkar, Sergei Vassilvitskii.
Efficient Architecture Search for Diverse Tasks.
Geometry-Aware Gradient Algorithms for Neural Architecture Search. ICLR 2021.
Adaptive Gradient-Based Meta-Learning Methods. NeurIPS 2019.
A Theoretical Analysis of Contrastive Unsupervised Representation Learning. ICML 2019.
A La Carte Embedding: Cheap but Effective Induction of Semantic Feature Vectors. ACL 2018.