Nonnegative matrix factorization (NMF) was initially proposed for parts-of-whole interpretations of nonnegative factors, but its real power lies in data clustering and combinatorial optimization. We show that (1) NMF with a least-squares objective is equivalent to K-means clustering, and (2) NMF with an I-divergence objective is equivalent to Probabilistic Latent Semantic Indexing. We extend NMF to Convex NMF, Kernel NMF, and Semi-NMF. Most recently, we have derived NMF-inspired algorithms to solve NP-hard combinatorial optimization problems: (a) multi-way normalized-cut spectral clustering, (b) graph matching, and (c) maximum clique finding on both graphs and bipartite graphs. NMF algorithms are extremely simple to implement, and their correctness and convergence can be rigorously proved using constrained optimization theory. These results also extend to tensor factorizations. We present extensive experiments demonstrating the effectiveness of these algorithms.
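As a rough illustration of how simple such algorithms can be, here is a minimal sketch of the standard Lee-Seung multiplicative updates for least-squares NMF (minimizing ||X - WH||_F^2 subject to W, H >= 0); the function name, iteration count, and smoothing constant are illustrative choices, not details from the talk:

```python
import numpy as np

def nmf(X, k, n_iter=200, eps=1e-9, seed=0):
    """Sketch of least-squares NMF via multiplicative updates.

    Factors a nonnegative matrix X (m x n) as W @ H with
    W (m x k) >= 0 and H (k x n) >= 0. The multiplicative form
    of the updates keeps both factors nonnegative automatically.
    """
    rng = np.random.default_rng(seed)
    m, n = X.shape
    # Random nonnegative initialization (eps avoids exact zeros).
    W = rng.random((m, k)) + eps
    H = rng.random((k, n)) + eps
    for _ in range(n_iter):
        # Each update multiplies by a ratio of nonnegative terms,
        # so nonnegativity is preserved at every step.
        H *= (W.T @ X) / (W.T @ W @ H + eps)
        W *= (X @ H.T) / (W @ H @ H.T + eps)
    return W, H
```

Each iteration is guaranteed not to increase the objective, which is the kind of convergence property the abstract says can be proved via constrained optimization theory.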
Joint work with Tao Li and Michael Jordan.
Chris Ding earned a Ph.D. from Columbia University and worked at Caltech, the Jet Propulsion Laboratory, and Lawrence Berkeley National Laboratory before joining UT Arlington in 2007. His research focuses on bioinformatics and machine learning, with an emphasis on PCA and matrix-based approaches.
Venue, Date, and Time
Venue: Newell Simon Hall 1507
Date: Monday, Feb 23, 2009
Time: 12:00 noon