I am also affiliated with the MPI for Biological Cybernetics as a research scientist.
KPC, software implementing nonlinear directed acyclic structure learning with weakly additive noise models (by Robert Tillman)
Consistent Nonparametric Tests of Independence, JMLR 2010
Hilbert Space Embeddings and Metrics on Probability Measures, JMLR 2010
Discussion of: Brownian distance covariance, Ann. Appl. Stat. 2009
Nonparametric Tree Graphical Models, AISTATS 2010
- Nonlinear directed acyclic structure learning
This algorithm learns the structure of a directed graphical model from data, combining a PC-style search based on nonparametric (kernel) measures of conditional dependence with local searches for additive noise models.
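The additive-noise idea can be illustrated with a toy sketch (all function names, bandwidths, and regularisation choices below are my own, not the authors' implementation): regress one variable on the other with kernel ridge regression and measure, via HSIC, whether the residuals are independent of the regressor. Under an additive noise model, independence holds (approximately) only in the causal direction.

```python
import numpy as np

def rbf(a, b, s):
    """Gaussian RBF kernel matrix between 1-D samples a and b."""
    return np.exp(-(a[:, None] - b[None, :]) ** 2 / (2 * s ** 2))

def med_bw(a):
    """Median-heuristic bandwidth for a 1-D sample."""
    d = np.abs(a[:, None] - a[None, :])
    return np.median(d[d > 0])

def krr_residuals(x, y, lam=0.1):
    """Residuals of a kernel ridge regression of y on x."""
    K = rbf(x, x, med_bw(x))
    alpha = np.linalg.solve(K + lam * np.eye(len(x)), y)
    return y - K @ alpha

def hsic(x, y):
    """Biased empirical HSIC (a kernel measure of dependence)."""
    n = len(x)
    H = np.eye(n) - 1.0 / n
    K = rbf(x, x, med_bw(x))
    L = rbf(y, y, med_bw(y))
    return np.trace(K @ H @ L @ H) / n ** 2

rng = np.random.default_rng(0)
x = rng.uniform(-2, 2, 300)
y = x ** 3 + rng.normal(0, 0.5, 300)   # additive noise model: x -> y

fwd = hsic(x, krr_residuals(x, y))     # residuals of y given x, vs. x
bwd = hsic(y, krr_residuals(y, x))     # residuals of x given y, vs. y
print(fwd, bwd)                        # fwd is typically the smaller of the two
```

The direction with the more independent residuals (smaller HSIC) is taken as the causal one; the actual algorithm embeds this comparison in the constraint-based graph search.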
- Fast Kernel ICA
Kernel ICA uses kernel measures of statistical independence to separate linearly mixed sources. We have made this process much faster by using an approximate Newton-like method on the special orthogonal group to perform the optimisation.
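As a toy illustration of the kernel ICA contrast (not the fast Newton-like method itself, and with all names and parameters being my own choices), one can demix two orthogonally mixed sources by searching over SO(2) for the rotation that minimises a kernel dependence measure (HSIC) between the unmixed components:

```python
import numpy as np

def rbf(a, s=1.0):
    return np.exp(-(a[:, None] - a[None, :]) ** 2 / (2 * s ** 2))

def hsic(x, y, s=1.0):
    """Biased empirical HSIC between two 1-D samples."""
    n = len(x)
    H = np.eye(n) - 1.0 / n
    return np.trace(rbf(x, s) @ H @ rbf(y, s) @ H) / n ** 2

def rot(t):
    return np.array([[np.cos(t), -np.sin(t)], [np.sin(t), np.cos(t)]])

rng = np.random.default_rng(1)
S = rng.uniform(-1, 1, (2, 200))   # two independent (sub-Gaussian) sources
theta0 = 0.6
X = rot(theta0) @ S                # orthogonally mixed observations

# search SO(2) for the demixing rotation minimising the dependence contrast;
# the fast method replaces this grid search with Newton-like steps on the group
thetas = np.linspace(0, np.pi / 2, 60)
scores = [hsic(*(rot(-t) @ X)) for t in thetas]
best = thetas[np.argmin(scores)]
```

Here `best` should land near `theta0` (up to the usual permutation and sign ambiguities of ICA); in higher dimensions the search space is the full special orthogonal group, which is why an efficient optimiser on that manifold matters.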
- Covariate Shift Correction
Given sets of observations of training and test data, we reweight the training data such that its distribution more closely matches that of the test data. We achieve this goal by matching covariate distributions between training and test sets in a reproducing kernel Hilbert space.
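A minimal sketch of the idea, in an unconstrained regularised variant (the full method solves a quadratic program with box and normalisation constraints on the weights; every name and parameter here is illustrative):

```python
import numpy as np

def rbf(a, b, s=1.0):
    return np.exp(-(a[:, None] - b[None, :]) ** 2 / (2 * s ** 2))

def kmm_weights(x_tr, x_te, s=1.0, lam=1e-3):
    """Weights beta minimising the RKHS distance between the reweighted
    training mean embedding and the test mean embedding."""
    n_tr, n_te = len(x_tr), len(x_te)
    K = rbf(x_tr, x_tr, s)
    kappa = (n_tr / n_te) * rbf(x_tr, x_te, s).sum(axis=1)
    beta = np.linalg.solve(K + lam * n_tr * np.eye(n_tr), kappa)
    return np.clip(beta, 0.0, None)  # the constrained QP enforces this exactly

rng = np.random.default_rng(2)
x_tr = rng.normal(0.0, 1.0, 500)   # training covariates
x_te = rng.normal(1.0, 0.5, 500)   # shifted test covariates
beta = kmm_weights(x_tr, x_te)

# training points lying where the test density is high receive larger weight
near_test = beta[np.abs(x_tr - 1.0) < 0.3].mean()
far_from_test = beta[np.abs(x_tr + 1.0) < 0.3].mean()
print(near_test, far_from_test)
```

The recovered weights approximate the importance ratio between test and training densities, without ever estimating either density explicitly.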
- Kernel Two-Sample Test
A kernel method to perform a statistical test of whether two samples are from different distributions. This test can be applied to high dimensional data, as well as to non-vectorial data such as graphs (i.e., wherever kernels provide a similarity measure).
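The statistic behind the test, the maximum mean discrepancy (MMD), has a simple empirical form; a sketch with an RBF kernel (bandwidth, sample sizes, and data are arbitrary choices of mine):

```python
import numpy as np

def rbf(X, Y, s=1.0):
    """RBF kernel matrix between the rows of X and Y."""
    sq = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
    return np.exp(-sq / (2 * s**2))

def mmd2(X, Y, s=1.0):
    """Biased estimate of the squared MMD between samples X and Y."""
    return rbf(X, X, s).mean() + rbf(Y, Y, s).mean() - 2 * rbf(X, Y, s).mean()

rng = np.random.default_rng(3)
P = rng.normal(size=(200, 2))
same = mmd2(P, rng.normal(size=(200, 2)))        # same distribution
diff = mmd2(P, rng.normal(2.0, 1.0, (200, 2)))   # mean-shifted distribution
print(same, diff)  # diff is much larger; in practice a permutation
                   # test supplies the rejection threshold
```

For non-vectorial data one only needs to swap in an appropriate kernel (e.g. a graph or string kernel) in place of `rbf`.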
- Kernel Independence Test
A statistical test of whether two random variables are independent. As with the two-sample test above, the independence test relies on kernels, and can be used for high dimensional and non-vectorial data (e.g. strings).
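The statistic behind this test, HSIC, is again easy to sketch empirically (bandwidths and data below are illustrative choices of mine):

```python
import numpy as np

def rbf(a, s=1.0):
    return np.exp(-(a[:, None] - a[None, :]) ** 2 / (2 * s ** 2))

def hsic(x, y, s=1.0):
    """Biased HSIC estimate: trace(K H L H) / n^2, H the centering matrix."""
    n = len(x)
    H = np.eye(n) - 1.0 / n
    return np.trace(rbf(x, s) @ H @ rbf(y, s) @ H) / n ** 2

rng = np.random.default_rng(4)
x = rng.normal(size=300)
indep = hsic(x, rng.normal(size=300))                      # independent pair
dep = hsic(x, np.cos(2 * x) + 0.1 * rng.normal(size=300))  # nonlinear dependence
print(indep, dep)  # dep is clearly larger; permuting one sample gives the null
```

Note that the dependence here is nonlinear and would be invisible to correlation; as with the two-sample test, a permutation test calibrates the threshold.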