Artificial Intelligence Seminar

  • Gates Hillman Centers
  • ASA Conference Room 6115
  • ADAMS WEI YU
  • Ph.D. Student
  • Machine Learning Department
  • Carnegie Mellon University

Efficient and Effective Models for Machine Reading Comprehension

Machine reading comprehension has attracted a great deal of attention in the AI, ML, and NLP communities. In this talk, I will introduce two efficient and effective models for approaching this task.

First, I will present LSTM-Jump, a model that can skip unimportant information in sequential data, mimicking the skimming behavior of human reading. Trained with an efficient reinforcement learning algorithm, this model can be several times faster than a vanilla LSTM at inference time.
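The skim-reading idea can be illustrated with a minimal sketch. This is not the paper's model: the learned jump policy is replaced here by a hypothetical hand-coded rule (skip further after a stopword) purely to show the read-then-jump control flow.

```python
def skim_read(tokens, read_size=1, max_jump=2):
    """Toy skim-reader: read a small chunk, then a policy decides how far to jump.

    Returns the indices of the tokens actually read. In LSTM-Jump the jump
    size comes from a policy trained with reinforcement learning; here a
    hand-coded stopword rule stands in for that policy (an assumption made
    only for illustration).
    """
    read = []
    i = 0
    while i < len(tokens):
        # Read a small chunk of tokens.
        chunk_end = min(i + read_size, len(tokens))
        read.extend(range(i, chunk_end))
        i = chunk_end
        # Stand-in jump policy: skip ahead when the last-read token
        # looks unimportant, otherwise continue reading normally.
        if read and tokens[read[-1]].lower() in {"the", "a", "of", "and", "on"}:
            i += max_jump
    return read
```

Because the reader visits only a subset of the positions, the number of recurrent steps, and hence inference time, drops roughly in proportion to how aggressively the policy jumps.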

Then I will introduce a sequence encoding method that discards recurrent networks and thus fully supports parallel training and inference. Based on this technique, we propose a new question-answering model, QANet. Combined with a data augmentation approach via back-translation, this model achieves No. 1 performance on the competitive Stanford Question Answering Dataset (SQuAD), while being several times faster than the prevalent models. Notably, the exact match score of QANet has exceeded human performance.
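The key property of a non-recurrent encoder, that all positions are processed at once rather than step by step, can be sketched with single-head self-attention (one of the building blocks QANet combines with convolutions; this simplified NumPy version is an illustration, not the model's actual architecture):

```python
import numpy as np

def self_attention(X):
    """Scaled dot-product self-attention over a (T, d) sequence.

    Unlike an RNN, there is no loop over time: the pairwise scores and all
    output rows are computed in a few matrix products, so every position is
    encoded in parallel.
    """
    d = X.shape[-1]
    scores = X @ X.T / np.sqrt(d)                       # (T, T) interactions
    # Row-wise softmax turns scores into attention weights.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ X                                   # (T, d) outputs
```

Because nothing depends on the previous time step's hidden state, the whole sequence can be encoded with batched matrix multiplications, which is what makes parallel training and inference possible.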

The talk is based on the following two works:

  1.  http://aclweb.org/anthology/P17-1172
  2. https://arxiv.org/pdf/1804.09541.pdf

The AI Seminar is generously sponsored by Apple.
