Adaptive Rates for Support Vector Machines

Mona Eberts (Stuttgart)
Wednesday, June 25, 2014 - 10:00am
Mohrenstraße 39, Erhard-Schmidt-Hörsaal

Support vector machines (SVMs) using Gaussian kernels are among the standard, state-of-the-art learning algorithms. For such SVMs applied to least squares regression, we establish new oracle inequalities. With the help of these oracle inequalities, we derive learning rates that are (essentially) minimax optimal under standard smoothness assumptions on the target function. We further utilize the oracle inequalities to show that the achieved learning rates can be obtained adaptively by a simple data-dependent parameter selection method.
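A minimal sketch of such a data-dependent parameter selection, assuming the least squares SVM is solved in its kernel ridge regression form and the kernel width and regularization parameter are chosen by hold-out validation over a finite grid. All data, grid values, and function names here are illustrative, not the talk's construction.

```python
# Hedged sketch: data-dependent selection of the Gaussian kernel width
# gamma and regularization parameter lambda for a least squares SVM
# (solved as kernel ridge regression) via hold-out validation.
import numpy as np

def gaussian_kernel(X, Z, gamma):
    # K[i, j] = exp(-gamma * ||x_i - z_j||^2)
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fit_ls_svm(X, y, gamma, lam):
    # Solve (K + lam * n * I) alpha = y for the coefficient vector alpha.
    n = len(X)
    K = gaussian_kernel(X, X, gamma)
    return np.linalg.solve(K + lam * n * np.eye(n), y)

def predict(X_train, alpha, gamma, X_test):
    return gaussian_kernel(X_test, X_train, gamma) @ alpha

# Synthetic regression data (illustrative only).
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 1))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.standard_normal(200)

# Split into training and validation halves.
X_tr, y_tr, X_val, y_val = X[:100], y[:100], X[100:], y[100:]

# Select (gamma, lambda) minimizing the validation mean squared error.
best = None
for gamma in [0.5, 2.0, 8.0]:
    for lam in [1e-4, 1e-2, 1.0]:
        alpha = fit_ls_svm(X_tr, y_tr, gamma, lam)
        err = np.mean((predict(X_tr, alpha, gamma, X_val) - y_val) ** 2)
        if best is None or err < best[0]:
            best = (err, gamma, lam)

print("validation MSE %.4f at gamma=%g, lambda=%g" % best)
```

The talk's point is that a simple scheme of this kind already attains the minimax-optimal rates adaptively, without prior knowledge of the target function's smoothness.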

Furthermore, in order to reduce computational costs, we develop a localized SVM approach that is based upon a partition of the input space and trains an individual SVM on each cell of the partition. We apply this local SVM to least squares regression using Gaussian kernels and deduce local learning rates that are essentially minimax optimal under standard smoothness assumptions on the regression function. This gives the first motivation for using local SVMs that is based not on computational requirements but on theoretical guarantees for the generalization performance.
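The localized approach can be sketched as follows, assuming a uniform partition of a one-dimensional input interval and an independent Gaussian-kernel least squares fit on each cell; the cell layout, class name, and parameter values are illustrative assumptions, not the construction analyzed in the talk.

```python
# Hedged sketch of the local SVM idea: partition the input domain into
# cells and train an independent Gaussian-kernel least squares model
# (kernel ridge form) on each cell's data. Prediction routes each test
# point to the model of the cell it falls into.
import numpy as np

def gaussian_kernel(X, Z, gamma):
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

class LocalLSSVM:
    def __init__(self, n_cells=4, gamma=8.0, lam=1e-3):
        self.n_cells, self.gamma, self.lam = n_cells, gamma, lam

    def _cell(self, X):
        # Index of the uniform cell along the first coordinate.
        t = (X[:, 0] - self.lo) / (self.hi - self.lo)
        return np.clip((t * self.n_cells).astype(int), 0, self.n_cells - 1)

    def fit(self, X, y):
        self.lo, self.hi = X[:, 0].min(), X[:, 0].max()
        cells = self._cell(X)
        self.models = {}
        for c in range(self.n_cells):
            mask = cells == c
            Xc, yc = X[mask], y[mask]
            # Each cell solves a small system instead of one global one.
            K = gaussian_kernel(Xc, Xc, self.gamma)
            alpha = np.linalg.solve(K + self.lam * len(Xc) * np.eye(len(Xc)), yc)
            self.models[c] = (Xc, alpha)
        return self

    def predict(self, X):
        cells = self._cell(X)
        out = np.empty(len(X))
        for c, (Xc, alpha) in self.models.items():
            mask = cells == c
            if mask.any():
                out[mask] = gaussian_kernel(X[mask], Xc, self.gamma) @ alpha
        return out

# Synthetic regression data (illustrative only).
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(400, 1))
y = np.sin(4 * X[:, 0]) + 0.1 * rng.standard_normal(400)

model = LocalLSSVM().fit(X, y)
mse = np.mean((model.predict(X) - y) ** 2)
print("training MSE: %.4f" % mse)
```

Because each cell's kernel system involves only that cell's points, the cubic cost of solving one global system is replaced by several much smaller solves, which is the computational benefit the localization was originally proposed for.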