My main area of interest is developing machine learning models that can handle non-stationary data distributions, with a particular focus on natural language processing applications. Following this leitmotif, I have worked on distributional shift in the form of domain shift, adversarial perturbations, and, more recently, continual learning of multiple tasks.
I also used to be an active contributor to DyNet, a toolkit for dynamic neural networks. Check it out!
Other than that, I like reading sci-fi, sleeping, eating, and playing video games. Recently I've picked up miniature painting and watercolor painting.
Before studying at CMU, I was an "Élève ingénieur" (engineering student) at École polytechnique in France.
My email is pmichel1[at]cs.cmu.edu. You can also find me on Twitter, where I mostly tweet about my own work.
- I gave a talk at the NLP with Friends online seminar on our ongoing work on parametric distributionally robust optimization. The talk was recorded and is available on YouTube.
- New paper, Weight Poisoning Attacks on Pre-trained Models, accepted as a long paper at ACL 2020. The code is available on GitHub.
- Our paper Are Sixteen Heads Really Better than One? was just accepted at NeurIPS 2019 as a poster! Code is available on GitHub.
Some projects I've been doing on the side (not so) recently: