I am a second-year PhD student in the Machine Learning Department at Carnegie Mellon University, advised by Aarti Singh and Barnabás Póczos.
Previously, I studied EECS and EMS at UC Berkeley, where I worked with Ming Gu, Lei Li, Michael Mahoney, and Stuart Russell on various matrix-related problems. I also spent a semester in the Department of Electronic Engineering at Tsinghua University.
My research interests broadly include topics in theoretical machine learning and statistics, such as matrix factorization, convex/non-convex optimization, transfer learning, reinforcement learning, non-parametric statistics, and robust statistics. On the application side, I am interested in applying machine learning techniques to precision agriculture.
I was born in Sydney and grew up in Beijing. I spent six wonderful years at SDSZ.
- Carnegie Mellon University Aug. 2015 - present
PhD student in Machine Learning, School of Computer Science
- University of California, Berkeley Aug. 2011 - May 2015
B.S. in Electrical Engineering and Computer Science
B.S. in Engineering Mathematics and Statistics
- Tsinghua University Feb. 2013 - Jun. 2013
Exchange student in the Department of Electronic Engineering
- Research Intern at Facebook AI Research (FAIR), Menlo Park, CA, May 2017 - Aug. 2017
Mentor: Yuandong Tian
- Research Intern at Microsoft Research (MSR), Redmond, WA, May 2016 - Aug. 2016
Mentors: Jianshu Chen, Lihong Li, Lin Xiao, and Dengyong Zhou
- Consulting Intern at Accenture, Beijing, China, May 2015 - Jul. 2015
- Software Engineering Intern at Google, Irvine, CA, May 2014 - Aug. 2014
- Consulting Intern at CCID Consulting, Beijing, China, May 2012 - Jun. 2012
- Gradient Descent Can Take Exponential Time to Escape Saddle Points,
Simon S. Du, Chi Jin, Jason D. Lee, Michael I. Jordan, Barnabas Poczos, Aarti Singh,
- On the Power of Truncated SVD for General High-rank Matrix Estimation Problems,
Simon S. Du, Yining Wang, Aarti Singh,
- Hypothesis Transfer Learning via Transformation Functions,
Simon S. Du, Jayanth Koushik, Aarti Singh, Barnabás Póczos,
[PDF] [Arxiv] [Poster]
- Stochastic Variance Reduction Methods for Policy Evaluation,
Simon S. Du, Jianshu Chen, Lihong Li, Lin Xiao, Dengyong Zhou,
International Conference on Machine Learning (ICML) 2017.
[PDF] [Arxiv] [Lihong's Talk at Simons Institute]
- Computationally Efficient Robust Estimation of Sparse Functionals,
Simon S. Du, Sivaraman Balakrishnan, Aarti Singh,
Conference on Learning Theory (COLT) 2017.
Merged with this paper
- Efficient Nonparametric Smoothness Estimation,
Shashank Singh, Simon S. Du, Barnabás Póczos,
Conference on Neural Information Processing Systems (NIPS) 2016.
- An Improved Gap-Dependency Analysis of the Noisy Power Method,
Maria-Florina Balcan*, Simon S. Du*, Yining Wang*, Adams Wei Yu*,
Conference on Learning Theory (COLT) 2016.
[PDF] [Arxiv] [Slides] [Talk]
- Spectral Gap Error Bounds for Improving CUR Matrix Decomposition and the Nyström Method,
David G. Anderson*, Simon S. Du*, Michael W. Mahoney*, Christopher Melgaard*, Kunming Wu*, Ming Gu,
International Conference on Artificial Intelligence and Statistics (AISTATS) 2015.
[PDF] [Supplement] [Code]
- Novel Quantization Strategies for Linear Prediction with Guarantees,
Simon S. Du**, Yichong Xu**, Yuan Li, Hongyang Zhang, Aarti Singh, Pulkit Grover,
International Conference on Machine Learning (ICML) 2016, On Device Intelligence (ONDI) workshop.
- Maxios: Large Scale Nonnegative Matrix Factorization for Collaborative Filtering,
Simon S. Du, Yilin Liu, Boyi Chen, Lei Li,
Conference on Neural Information Processing Systems (NIPS) 2014, workshop on Distributed Machine Learning and Matrix Computations.
**: Equal contribution. *: Alphabetical order, following the convention in mathematics and theoretical computer science.