Workshop on

Component Analysis Methods for Classification,

Clustering, Modeling and Estimation Problems in Computer Vision.

Kyoto, Japan, in conjunction with ICCV 2009

 


Aim:

The aim of this workshop is to bring together an interdisciplinary group of researchers from computer vision, pattern recognition, and machine learning to present new component analysis techniques and to identify the opportunities and challenges in applying these techniques to computer vision problems. The workshop will be a mixture of invited talks and presentations of previously unpublished, high-quality submitted papers.

 

Scope:

 

Linear and multilinear methods (e.g. Principal Component Analysis, Independent Component Analysis, tensor factorization, …) have been successfully applied to modeling, classification and clustering in numerous vision, graphics and signal processing tasks over the last four decades. Many learning/estimation problems in vision (e.g. active appearance models, structure from motion, spectral clustering) can be successfully solved using modifications of component analysis (CA) methods. CA techniques are especially appealing because many of them can be solved as generalized eigenvalue problems or alternated least-squares procedures, for which extremely efficient and numerically stable algorithms exist. The main limitation of these approaches is that they are usually optimal only for finding linear/multilinear structure in the data. In the late 1990s, however, many researchers in machine learning, neural networks and statistics showed that a wide range of non-linear problems in classification, clustering, visualization, dimensionality reduction and modeling can be cast as the spectral decomposition of a kernel matrix. These spectral approaches offer the potential to solve linear and non-linear estimation/learning problems in vision efficiently and without local minima.
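As a brief illustration of the spectral view described above, the sketch below computes kernel PCA as the eigendecomposition of a centered kernel matrix. It is only a minimal example under assumed choices (an RBF kernel with an arbitrary bandwidth, and illustrative function names); it is not drawn from any workshop material.

```python
import numpy as np

def kernel_pca(X, n_components, gamma=1.0):
    """Non-linear component analysis via the spectral decomposition of a
    kernel matrix (illustrative sketch; gamma is an assumed RBF width)."""
    # RBF (Gaussian) kernel matrix from pairwise squared distances
    sq = np.sum(X**2, axis=1)
    sq_dists = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)
    K = np.exp(-gamma * sq_dists)
    # Double-center the kernel matrix (equivalent to centering in feature space)
    n = K.shape[0]
    one_n = np.ones((n, n)) / n
    Kc = K - one_n @ K - K @ one_n + one_n @ K @ one_n
    # Spectral decomposition: eigenvectors of the centered kernel matrix
    eigvals, eigvecs = np.linalg.eigh(Kc)
    # eigh returns eigenvalues in ascending order; keep the leading ones
    idx = np.argsort(eigvals)[::-1][:n_components]
    # Projections of the training points onto the leading components
    return eigvecs[:, idx] * np.sqrt(np.maximum(eigvals[idx], 0.0))

# Usage: embed noisy 2-D points lying near a circle
rng = np.random.default_rng(0)
theta = rng.uniform(0, 2 * np.pi, 100)
X = np.c_[np.cos(theta), np.sin(theta)] + 0.05 * rng.standard_normal((100, 2))
Z = kernel_pca(X, n_components=2)
print(Z.shape)  # (100, 2)
```

Note that the whole non-linear problem reduces to a single symmetric eigendecomposition, which is the efficiency argument made in the paragraph above: no iterative non-convex optimization and hence no local minima.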

 

The goal of this workshop is to discuss the state of the art of component analysis algorithms for estimation and learning in computer vision. Relevant topics of the workshop include (but are not limited to):

 

Advances in standard CA techniques:

        Principal Component Analysis/Singular Value Decomposition, Linear Discriminant Analysis, Canonical Correlation Analysis, Independent Component Analysis, Partial Least Squares, Principal Component Regression, Correspondence Analysis, Redundancy Analysis, Functional Component Analysis, …

        Non-negative matrix factorization

        Spectral graph methods

        Kernel methods

        Tensor decomposition

        Multidimensional scaling

        Non-linear dimensionality reduction techniques (e.g. LLE, Isomap, …)

        Manifold learning

        Latent variable models

        Inverse eigenvalue problems

 

Applications of CA methods to computer vision problems:

       Appearance Models (active shape models, active appearance models, …)

       Segmentation with spectral graph methods (e.g. Normalized Cuts)

       Factorization methods for rigid and non-rigid structure from motion

       Object and face recognition

       Camera calibration

       Robot localization

       Feature selection

       Low dimensional visualization

       Visual geometry problems (e.g. trifocal tensor, fundamental matrix estimation, multiview geometry)

 

Open research problems:

       The role of representation in CA methods. How to build representations invariant to geometric transformations?

       How to learn an optimal representation for clustering, classification, …?

       How do spectral problems (with different normalizations) relate to error functions and least-squares estimation problems?

       Generative versus discriminative learning with CA methods.

       Learning from high dimensional data and few training samples. How to build reliable estimates of the components that generalize well from few training samples?

       How to properly normalize CA methods (e.g. how to normalize kernel matrices)?

       How do CA techniques compare to state-of-the-art techniques for classification (SVM, AdaBoost), modeling (probabilistic graphical models), …?

       Optimal selection of the number of components.

       Non-linear component analysis methods (beyond kernel methods). How to overcome local-minima problems?

       Unified view of classification, clustering, dimensionality reduction and modeling algorithms.

       How to learn sparse feature representations?