AMAZON: Presentation

  • Gates Hillman Centers
  • Rashid Auditorium 4401
Career Presentation

Research at Amazon

In this talk I will give a sample of some of the research done at AWS. In particular, I will present some recent results in Reinforcement Learning that combine on-policy and off-policy approaches to obtain rapidly converging, sample-efficient algorithms. The key idea in this work is to use propensity scoring and effective sample size reweighting to obtain an optimization algorithm that converges rapidly while taking advantage of a large replay buffer.
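As a rough illustration of the ideas above (not the speaker's actual algorithm), propensity scores can be formed as the ratio of target-policy to behavior-policy probabilities for samples in a replay buffer, and the effective sample size (here, Kish's formula) measures how much the buffer has drifted from the current policy. All names and data below are hypothetical:

```python
import numpy as np

def effective_sample_size(weights):
    # Kish's effective sample size: (sum w)^2 / sum(w^2).
    # Equals n for uniform weights, approaches 1 as weights concentrate.
    w = np.asarray(weights, dtype=float)
    return w.sum() ** 2 / (w ** 2).sum()

# Hypothetical replay buffer: behavior-policy probabilities mu (off-policy)
# and current-policy probabilities pi (on-policy) for 1000 stored actions.
rng = np.random.default_rng(0)
mu = rng.uniform(0.1, 1.0, size=1000)
pi = rng.uniform(0.1, 1.0, size=1000)

w = pi / mu                      # propensity (importance) weights
ess = effective_sample_size(w)   # low ESS => buffer too stale to trust
w_norm = w / w.sum()             # normalized weights for reweighted updates
```

In this sketch, a low ESS would signal that gradient estimates reweighted by `w_norm` are dominated by a few samples, so the buffer should be refreshed with on-policy data.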

Secondly, I will discuss some recent results in multivariate time series prediction. By applying de Finetti's theorem and causal ordering, we are able to obtain a necessary and sufficient characterization of sequence data in terms of a local and global decomposition. The resulting model is computationally efficient and highly accurate. Lastly, I will give an update on the D2L.ai project, which aims to bring deep learning education to every scientist and engineer.
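A toy numerical sketch of the local/global split mentioned above (purely illustrative, not the speaker's model): each series in a panel is treated as a shared global factor plus a series-specific local residual, and the global part is recovered as the cross-sectional mean.

```python
import numpy as np

# Simulate a hypothetical panel of N series over T steps: one shared
# global trend (a random walk) plus small per-series local noise.
rng = np.random.default_rng(1)
T, N = 200, 5
global_factor = np.cumsum(rng.normal(size=T))      # shared global component
local = rng.normal(scale=0.1, size=(T, N))         # series-specific components
series = global_factor[:, None] + local            # observed multivariate data

# Decompose: cross-sectional mean estimates the global part,
# residuals estimate the local parts.
global_est = series.mean(axis=1)
local_est = series - global_est[:, None]
```

By construction the two estimated components sum back to the observed panel exactly; the attraction of such a decomposition is that the global component can be modeled once and shared across all series.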

Alex studied physics at the University of Technology in Munich. After a PhD in computer science at the University of Technology in Berlin in 1998, he worked as a researcher at the IDA Group of the GMD. He joined the Australian National University in Canberra in 2000 and NICTA in 2004, where he served as group leader and professor until 2008. From 2008 until 2012 he worked at Yahoo Research, and subsequently from 2012 until 2014 at Google Research. He served as an adjunct professor at UC Berkeley in 2011-12 and 2019, and was a full professor at CMU in 2013-17. After founding Marianas Labs in 2015, he moved to Amazon Web Services in 2016, where he currently serves as VP and Distinguished Scientist. Alex has published over 200 papers and 5 books. His research interests cover kernel methods, Bayesian nonparametrics, large-scale inference, and deep learning.

At Amazon Web Services, he helps build AI and machine learning tools for everyone.
