====================================================================
                          CALL FOR PAPERS

          Sparse Representation and Low-rank Approximation
              NIPS 2011 Workshop, Sierra Nevada, Spain    
                      December 16 or 17, 2011

      http://www.cs.berkeley.edu/~ameet/sparse-low-rank-nips11

            Submission Deadline: Friday, October 14, 2011
====================================================================
Overview

Sparse representation and low-rank approximation are fundamental tools in fields as diverse as computer vision, computational biology, signal processing, natural language processing, and machine learning. Recent advances in sparse and low-rank modeling have led to increasingly concise descriptions of high-dimensional data, together with algorithms of provable performance and bounded complexity. Our workshop aims to survey recent work on sparsity and low-rank approximation and to provide a forum for open discussion of the key questions concerning these dimensionality reduction techniques. The workshop will be divided into two segments: a "sparsity segment" emphasizing sparse dictionary learning and a "low-rank segment" emphasizing scalability and large data. Paralleling the two segments are two paper tracks.

The sparsity track encourages submissions exploring various aspects of learning sparse latent representations and dictionaries, in the form of new algorithms, theoretical advances, and/or empirical results. Specific areas of interest include structured matrix factorization algorithms, Bayesian models for latent variable representations, analysis of random versus learned dictionaries, novel applications of dictionary learning, and relationships to compressed sensing.
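To make the flavor of this track concrete, the sketch below illustrates one simple form of dictionary learning: alternating between sparse coding by iterative soft thresholding (ISTA) and a least-squares dictionary update, for the objective min_{D,A} 0.5*||X - DA||_F^2 + lambda*||A||_1. It is a toy illustration only; the function names and parameter settings are our own choices, not a required baseline or a prescribed method.

    import numpy as np

    def soft_threshold(Z, t):
        # Proximal operator of the l1 norm.
        return np.sign(Z) * np.maximum(np.abs(Z) - t, 0.0)

    def sparse_codes(X, D, lam, n_iter=100):
        # ISTA: gradient step on 0.5*||X - D A||_F^2, then soft thresholding.
        A = np.zeros((D.shape[1], X.shape[1]))
        step = 1.0 / np.linalg.norm(D, 2) ** 2  # 1/L, L = Lipschitz constant
        for _ in range(n_iter):
            A = soft_threshold(A - step * D.T @ (D @ A - X), step * lam)
        return A

    def dictionary_update(X, A):
        # Least-squares fit of D given the codes, then renormalize the atoms.
        D = X @ A.T @ np.linalg.pinv(A @ A.T)
        return D / np.maximum(np.linalg.norm(D, axis=0), 1e-12)

    def learn_dictionary(X, n_atoms, lam=0.1, n_outer=20, seed=0):
        rng = np.random.default_rng(seed)
        D = rng.standard_normal((X.shape[0], n_atoms))
        D /= np.linalg.norm(D, axis=0)
        for _ in range(n_outer):
            A = sparse_codes(X, D, lam)   # sparse coding step
            D = dictionary_update(X, A)   # dictionary update step
        return D, A

    # Toy usage: learn a 2x-overcomplete dictionary for random data.
    X = np.random.default_rng(1).standard_normal((32, 500))
    D, A = learn_dictionary(X, n_atoms=64)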

The low-rank track encourages submissions exploring the impact of low-rank methods on large-scale machine learning, in the form of new algorithms, theoretical advances, and/or empirical results. Specific areas of interest include randomized low-rank approximation techniques, the performance of various low-rank methods on large-scale tasks, and the tradeoff between numerical precision and time/space efficiency in the context of machine learning performance, e.g., classification or clustering accuracy. We also welcome work on related topics that motivate additional interesting scenarios for the use of low-rank approximations in learning tasks.
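As one concrete example of the methods this track covers, the sketch below implements a simple randomized low-rank approximation: a Gaussian range finder with a few power iterations, followed by an SVD of the projected matrix, in the spirit of Halko, Martinsson, and Tropp. The function name and parameter defaults are illustrative choices, not a reference implementation.

    import numpy as np

    def randomized_svd(X, rank, n_oversample=10, n_power_iter=2, seed=0):
        rng = np.random.default_rng(seed)
        # Sample the range of X with a Gaussian test matrix.
        Omega = rng.standard_normal((X.shape[1], rank + n_oversample))
        Y = X @ Omega
        # A few power iterations sharpen the approximation when the spectrum
        # decays slowly; re-orthonormalizing each pass improves stability.
        for _ in range(n_power_iter):
            Y, _ = np.linalg.qr(X @ (X.T @ Y))
        Q, _ = np.linalg.qr(Y)   # orthonormal basis for the sampled range
        B = Q.T @ X              # small (rank + oversample) x n matrix
        U_small, s, Vt = np.linalg.svd(B, full_matrices=False)
        return (Q @ U_small)[:, :rank], s[:rank], Vt[:rank]

    # Toy usage: rank-20 approximation of a 2000 x 1000 matrix.
    X = np.random.default_rng(1).standard_normal((2000, 1000))
    U, s, Vt = randomized_svd(X, rank=20)
    rel_err = np.linalg.norm(X - (U * s) @ Vt) / np.linalg.norm(X)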

Submission Guidelines

Submissions should be written as extended abstracts, no longer than 4 pages in the NIPS LaTeX style. Style files and formatting instructions can be found at http://nips.cc/PaperInformation/StyleFiles. Submissions must be in PDF format. Authors' names and affiliations should be included, as the review process will not be double-blind. The extended abstract may be accompanied by an unlimited appendix and other supplementary material, with the understanding that anything beyond 4 pages may be ignored by reviewers.

Please send your PDF submission by email to SLR.nips11@gmail.com by 5:00 pm PDT on Friday, October 14. Your email must also specify the paper track ('sparsity' or 'low-rank') for which your paper should be considered. Notifications will be sent on or before November 4. Work that is under review, was recently published, or was presented elsewhere is welcome, provided that the extended abstract mentions this explicitly. Finally, note that there will be no official proceedings from this workshop.

Invited Speakers

  • Inderjit Dhillon (University of Texas at Austin)
  • Rodolphe Jenatton (INRIA)
  • Yi Ma (University of Illinois at Urbana-Champaign)
  • Gabriel Peyré (CNRS)
  • Martin Wainwright (University of California at Berkeley)
  • David Wipf (University of California at San Diego)

Organizers

Francis Bach (INRIA), Michael Davies (Edinburgh), Rémi Gribonval (INRIA), Lester Mackey (Berkeley), Michael Mahoney (Stanford), Mehryar Mohri (NYU, Google Research), Guillaume Obozinski (INRIA), Ameet Talwalkar (Berkeley)