I am a research scientist at Google Brain.
I received my PhD from the Language Technologies Institute at Carnegie Mellon University, advised by Jaime Carbonell. Sadly, Jaime passed away in 2020, and I have been working with Yulia Tsvetkov and Emma Strubell since then. Jaime will always be my advisor and is deeply missed. Previously, I obtained my BS in Computer Science and Mathematics from CMU as well.
Deep Learning, Natural Language Processing, Computer Vision.
Scaling Autoregressive Models for Content-Rich Text-to-Image Generation.
Jiahui Yu, Yuanzhong Xu, Jing Yu Koh, Thang Luong, Gunjan Baid, Zirui Wang, Vijay Vasudevan, Alexander Ku, Yinfei Yang, Burcu Karagol Ayan, Ben Hutchinson, Wei Han, Zarana Parekh, Xin Li, Han Zhang, Jason Baldridge, Yonghui Wu.
CoCa: Contrastive Captioners are Image-Text Foundation Models.
Jiahui Yu*, Zirui Wang*, Vijay Vasudevan, Legg Yeung, Mojtaba Seyedhosseini, Yonghui Wu.
SimVLM: Simple Visual Language Model Pretraining with Weak Supervision.
Zirui Wang, Jiahui Yu, Adams Wei Yu, Zihang Dai, Yulia Tsvetkov, Yuan Cao.
ICLR 2022. [arxiv]
Towards Zero-Label Language Learning.
Zirui Wang, Adams Wei Yu, Orhan Firat, Yuan Cao.
Gradient Vaccine: Investigating and Improving Multi-task Optimization in Massively Multilingual Models.
Zirui Wang, Yulia Tsvetkov, Orhan Firat, Yuan Cao.
ICLR 2021 (Spotlight). [arxiv]
Cross-lingual Alignment vs Joint Training: A Comparative Study and A Simple Unified Framework.
Zirui Wang*, Jiateng Xie*, Ruochen Xu, Yiming Yang, Graham Neubig, Jaime Carbonell.
ICLR 2020. [arxiv] [code] [presentation]
Characterizing and Avoiding Negative Transfer.
Zirui Wang, Zihang Dai, Barnabás Póczos, Jaime Carbonell.
CVPR 2019. [arxiv]
CMU Presidential Fellowship, 2020-2021.
CMU LTI Research Fellowship, 2017-2021.
Teaching Assistant @ CMU