My main research interest is developing machine learning models that can handle non-stationary data distributions, with a particular focus on natural language processing applications. Along this line, I have worked on distributional shift in the form of domain shift, adversarial perturbations, and, more recently, continual learning of multiple tasks. I am actively looking for research internships, so feel free to get in touch if you're looking for someone to work on any of these topics.
I'm also an active contributor to DyNet, a toolkit for dynamic neural networks. Check it out!
Other than that I like reading sci-fi, sleeping, eating and playing video games.
Before studying at CMU, I was an "Élève ingénieur" (engineering student) at École polytechnique in France.
My email is pmichel1[at]cs.cmu.edu. You can also find me on Twitter, where I mostly tweet about my own work.
- Our paper Are Sixteen Heads Really Better than One? was just accepted as a poster at NeurIPS 2019! Code is available on GitHub.
- The findings of our Machine Translation Robustness Shared Task are now online on arXiv.
- I gave a tutorial on MT at the JSALT 2019 summer school. The code is available on GitHub.
Some projects I've been doing on the side (not so) recently: