
SCS Faculty Candidate

Alexander Rush
Ph.D. Candidate
Computer Science and Artificial Intelligence Laboratory
Massachusetts Institute of Technology
Lagrangian Relaxation for Natural Language Processing
Tuesday, April 1, 2014 - 1:00pm
3305 Gates & Hillman Centers
Abstract:

Statistical methods have revolutionized natural language processing (NLP) and have led to advances in applications such as automatic translation, question answering, and structured search. Fueled by access to large amounts of textual data, researchers have developed increasingly detailed models of language to improve accuracy on these tasks. However, these improvements often come at the cost of a vastly larger search space for the underlying prediction problems. Solving these prediction problems optimally is very challenging, leading most large-scale NLP systems to rely heavily on heuristic search algorithms.

This talk presents methods for finding provably optimal predictions for important models in natural language processing. I will focus on two core NLP tasks, syntactic parsing and machine translation. For both of these problems I will show how Lagrangian relaxation, a classical method from combinatorial optimization, can be used to decompose challenging prediction problems into subproblems that are solvable with efficient combinatorial algorithms. Experiments show that the methods find optimal solutions for the vast majority of examples, with efficiency comparable to commonly used heuristic algorithms.
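To make the decomposition idea concrete, here is a minimal sketch in Python (with NumPy) of dual decomposition, a standard form of Lagrangian relaxation. The toy subproblems, linear scores, and step-size schedule are illustrative assumptions, not the parsing or translation models from the talk; what carries over is the structure: decode each subproblem independently, then adjust Lagrange multipliers until the two solutions agree, at which point the combined solution is provably optimal.

    import numpy as np

    def argmax_sub(theta):
        # Toy subproblem: maximize a linear score over binary vectors,
        # which simply selects the coordinates with positive score.
        return (theta > 0).astype(float)

    def dual_decompose(theta_a, theta_b, iters=100, rate=0.5):
        # Lagrange multipliers enforce the coupling constraint y_a == y_b.
        lam = np.zeros_like(theta_a)
        for t in range(iters):
            y_a = argmax_sub(theta_a + lam)  # subproblem 1, adjusted scores
            y_b = argmax_sub(theta_b - lam)  # subproblem 2, adjusted scores
            if np.array_equal(y_a, y_b):
                return y_a, True             # agreement certifies optimality
            # Subgradient step on the dual objective, with decreasing rate.
            lam -= (rate / (t + 1)) * (y_a - y_b)
        return y_a, False                    # no certificate within budget

    rng = np.random.default_rng(0)
    y, certified = dual_decompose(rng.normal(size=8), rng.normal(size=8))
    print(y, certified)

The certified branch is what the "vast majority of examples" claim refers to: whenever the subproblems agree, the relaxation is tight and the answer is exactly optimal, with no heuristic search involved.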

***

Alexander Rush is a Ph.D. candidate at MIT under the supervision of Professor Michael Collins and is currently a visiting scholar at Columbia University. He received a BA in computer science from Harvard University. He has won best paper awards at EMNLP 2010 and NAACL 2012. His research interests are in statistical natural language processing and structured prediction.

Faculty Host: Chris Dyer

For More Information, Please Contact:

maryjob@cs.cmu.edu