Machine Learning/Statistics Reading Group Talk

Xi Chen
  • Assistant Professor
  • Stern School of Business
  • New York University

Quantile Regression for big data with small memory

In this talk, we discuss the inference problem of quantile regression for a large sample size n under a limited memory constraint, where the memory can only store a small batch of data of size m. A popular approach, the naive divide-and-conquer method, only works when n = o(m^2) and is computationally expensive. This talk proposes a novel inference approach and establishes an asymptotic normality result that achieves the same efficiency as the quantile regression estimator computed on all the data. Essentially, our method allows an arbitrarily large sample size n relative to the memory size m. The method can also be applied to quantile regression in a distributed computing environment (e.g., a large-scale sensor network) or on real-time streaming data. This is joint work with Weidong Liu and Yichen Zhang.

Xi Chen is an assistant professor at the Stern School of Business at New York University. Before that, he was a postdoc in the group of Prof. Michael Jordan at UC Berkeley. He obtained his Ph.D. from the Machine Learning Department at Carnegie Mellon University.

He studies high-dimensional statistics, multi-armed bandits, and stochastic optimization. He received the Simons-Berkeley Research Fellowship, a Google Faculty Award, an Adobe Data Science Award, and a Bloomberg Research Award, and was featured in the 2017 Forbes list of "30 Under 30 in Science".

Faculty Host: Aarti Singh
