On the statistical properties of $\ell_p$-norm multiple kernel learning

Speaker(s): 
Marius Kloft (HU Berlin)
Date: 
Wednesday, May 6, 2015 - 10:00am
Location: 
WIAS, Erhard-Schmidt-Saal, Mohrenstraße 39, 10117 Berlin

Reproducing kernel Hilbert space methods have become a popular and versatile tool with many application areas in statistics and machine learning, the flagship method being the support vector machine. Nevertheless, a stumbling block on the way to the full automation of this method remains automatic kernel selection. In the seminal work of Lanckriet et al. (2004), it was shown that it is computationally feasible to simultaneously learn a support vector machine and a linear combination of kernels; this approach is dubbed "multiple kernel learning". In this talk, we discuss a further extension of this methodology, using an $\ell_p$ ($p\geq1$) regularization of the kernel coefficients, which can be understood as enforcing a degree of soft sparsity. We present a statistical analysis of the method's performance in terms of the convergence of its excess loss, based on precise bounds on its Rademacher complexity. We also illustrate the practical value of this approach with applications to bioinformatics and image recognition.
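As a rough illustration of the alternating scheme commonly used for $\ell_p$-norm multiple kernel learning, the sketch below substitutes kernel ridge regression for the SVM as the base learner (an assumption made here only for simplicity, since it has a closed-form fit) and uses the kernel-weight update $\theta_m \propto \|w_m\|^{2/(p+1)}$ with $\|w_m\|^2 = \theta_m^2\,\alpha^\top K_m \alpha$, normalized so that $\|\theta\|_p = 1$. All function and variable names are illustrative, not taken from the talk.

```python
import numpy as np

def lp_mkl_krr(kernels, y, p=2.0, lam=1.0, n_iter=20):
    """Sketch of l_p-norm MKL with kernel ridge regression as base learner.

    Alternates between (1) fitting the dual coefficients alpha for the
    combined kernel K = sum_m theta_m K_m and (2) a closed-form update
    of the kernel weights theta, kept on the l_p unit sphere.
    """
    M, n = len(kernels), len(y)
    theta = np.full(M, M ** (-1.0 / p))  # uniform start with ||theta||_p = 1
    alpha = np.zeros(n)
    for _ in range(n_iter):
        K = sum(t * Km for t, Km in zip(theta, kernels))
        # kernel ridge regression fit for the current combined kernel
        alpha = np.linalg.solve(K + lam * np.eye(n), y)
        # block norms ||w_m||^2 = theta_m^2 * alpha^T K_m alpha
        w2 = np.array([t**2 * (alpha @ Km @ alpha)
                       for t, Km in zip(theta, kernels)])
        # closed-form weight update, then renormalize to ||theta||_p = 1
        theta = w2 ** (1.0 / (p + 1.0))
        theta /= np.linalg.norm(theta, ord=p)
    return theta, alpha
```

On synthetic data where only the first kernel carries signal, the learned weight vector concentrates on that kernel, with the degree of concentration governed by $p$ (smaller $p$ gives sparser weights).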