I am a PhD student in computer science at Carnegie Mellon University, advised by Nina Balcan and Ameet Talwalkar. My research focuses on the foundations and applications of machine learning, in particular the theoretical and practical understanding of meta-learning and automation. I am a recipient of the Facebook PhD Fellowship and have spent time as an intern at Google Research - New York, Microsoft Research - New England, Lawrence Livermore National Laboratory, and the Princeton Plasma Physics Laboratory. Previously, I received an AB in Mathematics and an MSE in Computer Science from Princeton University.
Private Algorithms with Private Predictions.
Kareem Amin, Travis Dick, Mikhail Khodak, Sergei Vassilvitskii.
Meta-Learning in Games.
Keegan Harris*, Ioannis Anagnostides*, Gabriele Farina, Mikhail Khodak, Zhiwei Steven Wu, Tuomas Sandholm.
Meta-Learning Adversarial Bandits.
Maria-Florina Balcan, Keegan Harris, Mikhail Khodak, Zhiwei Steven Wu.
AANG: Automating Auxiliary Learning.
Lucio M. Dery, Paul Michel, Mikhail Khodak, Graham Neubig, Ameet Talwalkar.
Provably Tuning the ElasticNet Across Instances. To appear in NeurIPS 2022.
Efficient Architecture Search for Diverse Tasks. To appear in NeurIPS 2022.
Learning Predictions for Algorithms with Predictions. To appear in NeurIPS 2022.
Mikhail Khodak, Maria-Florina Balcan, Ameet Talwalkar, Sergei Vassilvitskii.
NAS-Bench-360: Benchmarking Neural Architecture Search on Diverse Tasks. To appear in NeurIPS 2022 (Datasets and Benchmarks Track).
Geometry-Aware Gradient Algorithms for Neural Architecture Search. ICLR 2021.
Adaptive Gradient-Based Meta-Learning Methods. NeurIPS 2019.
A Theoretical Analysis of Contrastive Unsupervised Representation Learning. ICML 2019.
A La Carte Embedding: Cheap but Effective Induction of Semantic Feature Vectors. ACL 2018.