Language Technologies Institute
School of Computer Science
Carnegie Mellon University
Office: GHC 5414
Email: ziruiw [at] cs (dot) cmu (dot) edu
Twitter: @MrZiruiWang
I am a PhD student at the Language Technologies Institute at CMU, advised by Jaime Carbonell. Sadly, Jaime passed away in 2020, and I have since been working with Yulia Tsvetkov and Emma Strubell. Jaime will always be my advisor and is deeply missed.
Previously, I obtained my B.S. in Computer Science and Mathematics at CMU.
My research is supported by Carnegie Mellon's Presidential Fellowship.
Obtained the first super-human and state-of-the-art performance (first score above 90) on the SuperGLUE benchmark, as of Dec 20, 2020.
Transfer Learning, Meta Learning, Natural Language Processing, Computer Vision.
Gradient Vaccine: Investigating and Improving Multi-task Optimization in Massively Multilingual Models.
Zirui Wang, Yulia Tsvetkov, Orhan Firat, Yuan Cao.
ICLR 2021 (Spotlight).
[arxiv]
On Negative Interference in Multilingual Models: Findings and A Meta-Learning Treatment.
Zirui Wang, Zachary C Lipton, Yulia Tsvetkov.
EMNLP 2020.
[arxiv]
[code]
Efficient Meta Lifelong-Learning with Limited Memory.
Zirui Wang*, Sanket Vaibhav Mehta*, Barnabás Póczos, Jaime Carbonell.
EMNLP 2020.
[arxiv]
[code]
Cross-lingual Alignment vs Joint Training: A Comparative Study and A Simple Unified Framework.
Zirui Wang*, Jiateng Xie*, Ruochen Xu, Yiming Yang, Graham Neubig, Jaime Carbonell.
ICLR 2020.
[arxiv]
[code]
[presentation]
Characterizing and Avoiding Negative Transfer.
Zirui Wang, Zihang Dai, Barnabás Póczos, Jaime Carbonell.
CVPR 2019.
[arxiv]
Towards more Reliable Transfer Learning.
Zirui Wang, Jaime Carbonell.
ECML-PKDD 2018.
[arxiv]
[sup]
[slides]
Student Researcher @ Google Brain, May 2020 - Present
Hosts: Yuan Cao, Orhan Firat, Adams Yu
CMU Presidential Fellowship, 2020-2021.
CMU LTI Research Fellowship, 2017-present.
Teaching Assistant @ CMU