Language Technologies Institute Colloquium

Ndapa Nakashole

  • Assistant Professor and Peggy and Peter Preuss Faculty Scholar
  • Department of Computer Science and Engineering
  • University of California, San Diego

Machine Reading for Everyone

Machine reading tools such as question answering systems have the potential to accelerate tasks that involve synthesizing information buried in vast text collections.

Recent advances in training deep neural networks have produced high-performing machine reading models. However, the current success of deep learning hinges on having large quantities of labeled data to robustly estimate model parameters. For many languages, little to no labeled data is available for machine reading.

A key question is therefore: how can we develop machine reading methods whose performance on new languages is not contingent upon the availability of substantial amounts of labeled data? In this talk I will present some of our ongoing work on learning representations that are invariant to the shift in language and can be used for cross-lingual machine reading.

Ndapa Nakashole is an Assistant Professor in the Department of Computer Science and Engineering at the University of California, San Diego. She works in statistical natural language processing, with an emphasis on machine reading and an interest in bringing NLP to low-resource languages.

Faculty Host: Yulia Tsvetkov

Refreshments will be served at 4:00 pm in the 5th floor LTI kitchen area.

For More Information, Please Contact: