Hongyang Zhang 张弘扬
I am a fourth-year Ph.D. student at Carnegie Mellon University, co-advised by Maria-Florina Balcan and David P. Woodruff.
Before joining CMU, I graduated from Peking University in 2015. I was fortunate to work with Zhouchen Lin and Chao Zhang.
My research interests broadly span the theory and applications of machine learning and algorithms, including adversarial defense and attack, non-convex/convex optimization, deep learning, low-rank subspace recovery, noise-tolerant active learning, property testing, compressed sensing, etc.
Machine Learning, Optimization, and their Applications. Current research focus includes:
Building theoretical foundations for computationally efficient adversarial defense and attack, and developing practical, large-scale algorithms for real-world AI security problems.
Developing new paradigms for reaching global optimality in non-convex optimization in polynomial time. Designing algorithms and understanding the optimization landscape (e.g., the duality gap) of deep neural networks, GANs, matrix factorization, etc.
Designing principled, practical, sub-linear algorithms for big-data problems with near-optimal sample complexity. These include matrix completion and sensing, robust PCA, margin-based active learning, property testing, phase retrieval, etc.
Applications of machine learning models in image and video processing, medical data, etc.
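As a toy illustration of the matrix completion models mentioned above, here is a minimal sketch of the generic Soft-Impute heuristic (iterative soft-thresholded SVD for nuclear-norm-regularized completion). This is a standard textbook heuristic with illustrative parameter choices, not code from any of the papers below.

```python
import numpy as np

def soft_impute(M, mask, reg=0.5, n_iters=200):
    """Complete a partially observed matrix by iterative soft-thresholded SVD.

    M    : matrix with observed entries filled in (zeros elsewhere)
    mask : boolean matrix, True where an entry of M is observed
    reg  : soft-threshold level on singular values (nuclear-norm proximal step)
    """
    X = np.zeros_like(M)
    for _ in range(n_iters):
        # Keep observed entries fixed; fill the rest with the current estimate.
        filled = np.where(mask, M, X)
        U, s, Vt = np.linalg.svd(filled, full_matrices=False)
        # Shrink singular values toward zero to encourage low rank.
        s = np.maximum(s - reg, 0.0)
        X = (U * s) @ Vt
    return X

# Recover a random rank-2 matrix from roughly 60% of its entries.
rng = np.random.default_rng(0)
truth = rng.standard_normal((30, 2)) @ rng.standard_normal((2, 30))
mask = rng.random(truth.shape) < 0.6
X_hat = soft_impute(np.where(mask, truth, 0.0), mask)
err = np.linalg.norm(X_hat - truth) / np.linalg.norm(truth)
```

With this many noiseless observations of a rank-2 matrix, the relative error `err` should be small; theoretical work on matrix completion characterizes exactly when such recovery succeeds with near-optimal sample complexity.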
2018/11/8. In the NIPS 2018 Adversarial Vision Challenge, our team won 1st place in both the Robust Model Track and the Targeted Attacks Track, and 3rd place in the Untargeted Attacks Track [News]. Stay tuned for our new theory-inspired approaches, which will appear soon!
2018/9/27. One paper was accepted to SODA 2019.
2018/9/4-2018/10/9. I was visiting the Simons Institute at UC Berkeley.
2018/6/27. One paper was accepted to Proceedings of the IEEE.
2018/5/29-2018/8/24. I was doing an internship at Petuum, Inc.
2018/4/16. One paper was accepted to ICALP 2018.
2017/10/26. One paper was accepted to ITCS 2018.
2017/9/4. Two papers were accepted to NIPS 2017.
2017/6/15-2017/8/15. I was visiting the IBM theory group in San Jose.
2017/5/12. One paper was accepted to ICML 2017.
2017/5/8. My new book has been published by Elsevier Press. [Elsevier link] [Amazon link]
2016/8/12. One paper was accepted to NIPS 2016.
2016/5/3. One paper was accepted to IEEE Transactions on Information Theory.
2016/4/26. One paper was accepted to COLT 2016.
With Maria-Florina Balcan, Yi Li, David P. Woodruff (α-β order). "Testing Matrix Rank, Optimally", SODA 2019, San Diego, USA. [pdf] [arXiv]
Hongyang Zhang, Susu Xu, Jiantao Jiao, Pengtao Xie, Ruslan Salakhutdinov, Eric P. Xing. "Stackelberg GAN: Towards Provable Minimax Equilibrium via Multi-Generator Architectures", 2018. [arXiv]
Hongyang Zhang, Junru Shao, Ruslan Salakhutdinov. "Deep Neural Networks with Multi-Branch Architectures Are Less Non-Convex", 2018. [arXiv]
With Maria-Florina Balcan, Yingyu Liang, David P. Woodruff (α-β order). "Matrix Completion and Related Problems via Strong Duality", ITCS 2018, Cambridge, USA. [arXiv]
With Pranjal Awasthi, Maria-Florina Balcan, Nika Haghtalab (α-β order). "Learning and 1-bit Compressed Sensing under Asymmetric Noise", COLT 2016, New York, USA. [pdf]
BOOK: With Zhouchen Lin (α-β order). "Low Rank Models in Visual Analysis: Theories, Algorithms and Applications", Elsevier Press, 2017. [Elsevier link] [Amazon link]
Table of Contents
Linear Models (Single Subspace Models, Multiple-Subspace Models, Theoretical Analysis)
Non-Linear Models (Kernel Methods, Laplacian and Hyper-Laplacian Methods, Locally Linear Representation, Transformation Invariant Clustering)
Optimization Algorithms (Convex Algorithms, Non-Convex Algorithms, Randomized Algorithms)
Representative Applications (Video Denoising, Background Modeling, Robust Alignment by Sparse and Low-Rank Decomposition, Transform Invariant Low-Rank Textures, Motion and Image Segmentation, Image Saliency Detection, Partial-Duplicate Image Search, Image Tag Completion and Refinement, Other Applications)
Conclusions (Low-Rank Models for Tensorial Data, Nonlinear Manifold Clustering, Randomized Algorithms)
Journal Refereeing: Journal of Machine Learning Research, Machine Learning, Proceedings of the IEEE, IEEE Journal of Selected Topics in Signal Processing, IEEE Transactions on Pattern Analysis and Machine Intelligence, IEEE Transactions on Signal Processing, IEEE Transactions on Cybernetics, IEEE Signal Processing Letters, IEEE Access, Neurocomputing.
Conference Refereeing: AAAI 2016, ICML 2016, NIPS 2016, IJCAI 2017 (PC member), STOC 2017, NIPS 2017 (PC member), AAAI 2018 (PC member), STOC 2018, ISIT 2018, ICML 2018 (PC member), COLT 2018, NIPS 2018 (PC member), Approx 2018, ACML 2018 (PC member), AISTATS 2019 (PC member), ITCS 2019.
Volunteer: ICML 2016.
Testing and Learning from Big Data, Optimally, CMU AI Lunch 2018. [slide]
New Paradigms and Global Optimality in Non-Convex Optimization, CMU Theory Lunch 2017. [slide] [video]
Active Learning of Linear Separators under Asymmetric Noise, invited talk at Asilomar 2017. [slide]
Noise-Tolerant Life-Long Matrix Completion via Adaptive Sampling, CMU Machine Learning Lunch 2016. [slide]
10-702/36-702 Statistical Machine Learning (at CMU, TA for Larry Wasserman): Spring 2018.
10-725/36-725 Convex Optimization (at CMU, TA for Pradeep Ravikumar and Aarti Singh): Fall 2017.
Image Processing (at PKU, TA for Chao Zhang): Spring 2014.
I enjoy traveling and photography. Check here for some of the photos I have taken.