Tuesday, Nov 03, 2020, 12:00 noon - 1:00 PM ET. Link to Zoom for Online Seminar.


Aaron Courville -- Emerging and preserving compositional structure through iterated learning.

Abstract: Iterated learning is a theory of how the compositional structure of human language emerged. The theory holds that intergenerational language transmission creates learning bottlenecks that privilege compositional structure. Recent work in the machine learning community has shown that the iterated learning mechanism can also promote compositional structure in language emergence during communication between neural-network-based AI agents. In this talk, I will describe our recent efforts at putting iterated learning to work in applications of Neural Module Networks to simple Visual Question Answering and in self-play training scenarios with dialogue models. We find that applying iterated learning to the generation of the program that specifies the assembly of distinct neural network modules leads to higher accuracy in program prediction and supports systematic generalization to test question templates not seen in the training set. In the context of self-play training of dialogue agents, we find the surprising result that iterated learning can mitigate language drift.
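For readers unfamiliar with the mechanism the abstract refers to, the following is a minimal, purely illustrative sketch of an iterated-learning loop with a transmission bottleneck. It is not the speaker's training setup (which uses neural agents, module-network programs, and dialogue self-play); the toy `random_language`, `learn_from_sample`, and `iterated_learning` functions below are hypothetical placeholders meant only to show the teacher-to-student generational structure.

```python
# Illustrative sketch of iterated learning (hypothetical; not the speaker's code).
# Each generation, a new "student" learns the language from a limited sample of the
# previous "teacher's" utterances (the learning bottleneck), then the student becomes
# the teacher for the next generation.

import random

MEANINGS = [(shape, color) for shape in range(4) for color in range(4)]

def random_language():
    """Generation-0 teacher: an arbitrary (holistic) meaning -> word mapping."""
    return {m: f"w{random.randrange(1000)}" for m in MEANINGS}

def learn_from_sample(sample):
    """Student induces a full language from a small sample of (meaning, word) pairs.
    Unseen meanings reuse observed words -- a crude stand-in for the generalization
    bias a real learner (e.g. a neural network) would supply."""
    language = dict(sample)
    observed_words = list(sample.values())
    for m in MEANINGS:
        if m not in language:
            language[m] = random.choice(observed_words)
    return language

def iterated_learning(generations=10, bottleneck=6):
    teacher = random_language()
    for _ in range(generations):
        shown = random.sample(MEANINGS, bottleneck)   # learning bottleneck
        sample = {m: teacher[m] for m in shown}       # limited transmitted data
        teacher = learn_from_sample(sample)           # student becomes next teacher
    return teacher

if __name__ == "__main__":
    final_language = iterated_learning()
    print("distinct words after transmission:", len(set(final_language.values())))
```

In the work described in the talk, the "language" being transmitted is instead the program that assembles neural network modules (for VQA) or the utterances exchanged in dialogue self-play, and each generation is a freshly trained or re-initialized neural agent rather than a lookup table.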

Bio: Aaron Courville is an Associate Professor in the Department of Computer Science and Operations Research at the Université de Montréal. He received his PhD from the Robotics Institute, Carnegie Mellon University. He is one of the early contributors to Deep Learning, a founding member of Mila, and a fellow of the CIFAR program on Learning in Machines and Brains. Together with Ian Goodfellow and Yoshua Bengio, he co-wrote the seminal textbook on Deep Learning. His current research focuses on the development of deep learning models and methods. He is particularly interested in deep generative models and multimodal machine learning, with applications in computer vision and natural language processing. Aaron holds a Canada CIFAR AI Chair, and his research is supported in part by Microsoft Research, Samsung, Hitachi, and a Google Focused Research Award.