Machine Learning / Duolingo Seminar

  • Remote Access - Zoom
  • Virtual Presentation - ET

Why we should prefer simple causal models

It is well known that learning statistical associations from finite data requires regularization to avoid overfitting: regularization terms penalize overly complex functions, lowering the risk that they capture random noise in the data. In the limit of infinite sample size, however, one can learn arbitrarily complex statistical relations. I argue that regularization is still recommended in this population limit if one is interested in a causal rather than a purely statistical model, because regularization can also mitigate the bias introduced by hidden common causes. I show this for simple linear and non-linear regression tasks, where there is an explicit formal analogy between finite-sample bias and confounding bias. My theoretical results suggest that learning causal relations in the presence of hidden common causes should use particularly simple models.
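
To illustrate the central claim, here is a minimal NumPy sketch (an illustration for this announcement, not code from the paper; the model and all parameters are assumptions, chosen so that confounding dominates). A hidden confounder Z drives both X and Y; the sample is large enough that finite-sample overfitting is negligible, yet the unregularized fit converges to a confounded solution, while a strongly regularized ridge fit lands closer to the true causal coefficients:

```python
import numpy as np

rng = np.random.default_rng(0)

n, d = 200_000, 10            # near-population sample size, 10 covariates
a = 0.3 * rng.normal(size=d)  # true causal effect of X on Y (small)
b = rng.normal(size=d)        # effect of the hidden confounder Z on X
c = 10.0                      # effect of Z on Y (strong confounding)

Z = rng.normal(size=n)                        # hidden common cause
X = np.outer(Z, b) + rng.normal(size=(n, d))  # X is confounded by Z
Y = X @ a + c * Z + rng.normal(size=n)        # Y depends on X and on Z

def ridge_fit(X, Y, lam):
    """Ridge regression (X'X + lam*n*I)^{-1} X'Y; lam = 0 gives OLS."""
    n, d = X.shape
    return np.linalg.solve(X.T @ X + lam * n * np.eye(d), X.T @ Y)

beta_ols   = ridge_fit(X, Y, 0.0)   # converges to the confounded regression vector
beta_ridge = ridge_fit(X, Y, 20.0)  # heavy shrinkage toward zero

print("distance to the true causal coefficients:")
print("  OLS  :", np.linalg.norm(beta_ols - a))   # stays large despite huge n
print("  ridge:", np.linalg.norm(beta_ridge - a)) # markedly smaller
```

Even with 200,000 samples the unregularized estimator retains the confounding bias (its population limit is a + c Sigma_X^{-1} Cov(X, Z) rather than a), whereas shrinkage damps that bias term more than it distorts the small causal signal. This is the sense in which confounding bias behaves like a finite-sample effect that regularization can control.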

Reference paper: D. Janzing, "Causal Regularization," NeurIPS 2019.

Bio sketch:
Education:
- Diplom in Physics (University of Tübingen, Germany) in 1995
- PhD in Mathematics (University of Tübingen) in 1998
- Habilitation (teaching qualification) in Computer Science (Karlsruhe Institute of Technology, KIT, Germany) in 2006

Employment:
- From 1998 to 2006 he worked on Quantum Information and Quantum Thermodynamics at KIT.
- In 2001 he started teaching and research on Causal Inference at KIT.
- In 2007 he joined the Max Planck Institute for Biological Cybernetics, Tübingen, where he founded the group ‘Causal Inference’ together with Bernhard Schölkopf. In 2011, the group became affiliated with the newly founded MPI for Intelligent Systems.
- In 2018 he joined Amazon, where he leads a team of scientists working on causal inference.

Dominik Janzing’s primary interest is the foundations of causal inference (including its relation to physics), particularly foundational questions raised by real-world problems.

Zoom Participation. See announcement.
