Artificial Intelligence Seminar

  • Remote Access Enabled - Zoom
  • Virtual Presentation
  • ANIMASHREE ANANDKUMAR
  • Bren Professor, Department of Computing + Mathematical Sciences
  • Director of Machine Learning Research, NVIDIA
  • California Institute of Technology

Bridging the gap between artificial and human intelligence: Feedback and Compositionality

Deep learning has yielded impressive performance over the last few years. However, it is no match for human perception and reasoning. Recurrent feedback in the human brain has been shown to be critical for robust perception: it can correct potential errors using an internal generative model of the world. Inspired by this, we augment any existing neural network with feedback (NN-F) in a Bayes-consistent manner. We demonstrate inherent robustness in NN-F that is far superior to that of standard neural networks.
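The feedback idea described above can be illustrated with a toy sketch: a feedforward pass produces an initial latent code, and a tied generative (top-down) model iteratively refines that code by reducing reconstruction error. This is only a minimal illustrative sketch, not the NN-F method from the talk; the names `W`, `G`, and `feedback_inference` are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear "recognition" weights (input dim 8 -> latent dim 4)
# and a tied generative model mapping the latent back to input space.
W = rng.standard_normal((4, 8)) * 0.1
G = W.T  # illustrative tied generative (feedback) weights

def feedback_inference(x, steps=20, lr=0.5):
    """Refine the latent code by gradient descent on ||x - G h||^2,
    i.e. correct the feedforward estimate using top-down prediction error."""
    h = W @ x                    # initial feedforward estimate
    for _ in range(steps):
        err = x - G @ h          # top-down prediction error
        h = h + lr * (G.T @ err) # gradient step on the latent code
    return h

x = rng.standard_normal(8)
h = feedback_inference(x)
recon_err = np.linalg.norm(x - G @ h)        # error after feedback refinement
ff_err = np.linalg.norm(x - G @ (W @ x))     # error of the pure feedforward pass
```

Because each iteration is a descent step on the reconstruction objective, the refined code reconstructs the input at least as well as the single feedforward pass, loosely mirroring how recurrent feedback can correct errors in perception.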

Compositionality is another important hallmark of human intelligence. Humans can compose concepts to reason about entirely new scenarios. We have created a new dataset for few-shot learning, inspired by the Bongard challenge. We show that all existing meta-learning methods fall severely short of human performance. We argue that neuro-symbolic reasoning is critical for tackling such few-shot learning challenges and showcase some success stories.

Anima Anandkumar is a Bren Professor at Caltech and Director of ML Research at NVIDIA. She was previously a Principal Scientist at Amazon Web Services. She has received several honors, including the Alfred P. Sloan Fellowship, the NSF CAREER Award, Young Investigator Awards from the DoD, and faculty fellowships from Microsoft, Google, Facebook, and Adobe. She is part of the World Economic Forum's Expert Network. She is passionate about designing principled AI algorithms and applying them in interdisciplinary applications. Her research focuses on unsupervised AI, optimization, and tensor methods.

The AI Seminar is generously sponsored by Fortive.

Zoom Participation. See announcement.

For More Information, Please Contact: