Bootstrap tuned model selection

Vladimir Spokoiny (WIAS Berlin/HU Berlin)
Wednesday, May 4, 2016 - 10:00am
WIAS, Erhard-Schmidt-Saal, Mohrenstraße 39, 10117 Berlin

In the problem of model selection for a given family of linear estimators \tilde{\theta}_m, m \in \mathcal{M}, ordered by their variance, we offer a new "smallest accepted" approach motivated by Lepski's device and the multiple testing idea. The procedure selects the smallest model that satisfies the acceptance rule based on comparisons with all larger models. The method is completely data-driven and uses no prior information about the variance structure of the noise: its parameters are adjusted to the underlying, possibly heterogeneous noise by the so-called "propagation condition" using a wild bootstrap method. The validity of the bootstrap calibration is proved for finite samples with an explicit error bound. We provide a comprehensive theoretical study of the method and describe in detail the set of possible values of the selected parameter \hat{m}. We also establish precise oracle error bounds for the corresponding estimator \hat{\theta} = \tilde{\theta}_{\hat{m}}, which apply equally to estimation of the whole parameter vector, a subvector of it, a linear mapping of it, or a linear functional.
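The "smallest accepted" rule and its bootstrap calibration can be sketched in code. This is an illustrative reimplementation, not the speaker's method or code: the moving-average family standing in for the ordered linear estimators, the sup-norm comparison distance, the Rademacher multiplier weights, and the quantile level `alpha` used to set the thresholds are all assumptions made for the sketch (the talk calibrates the thresholds via the propagation condition rather than a single quantile level).

```python
import math
import random
import statistics


def moving_average(w):
    """A simple linear smoother: causal moving average with window w.
    Larger w means more smoothing, hence smaller variance (a 'smaller' model)."""
    return lambda y: [statistics.mean(y[max(0, i - w + 1):i + 1])
                      for i in range(len(y))]


def sup_dist(a, b):
    """Sup-norm distance between two fits (an assumed choice of metric)."""
    return max(abs(x - z) for x, z in zip(a, b))


def wild_bootstrap_thresholds(estimators, y, n_boot=200, alpha=0.1, seed=0):
    """Calibrate pairwise acceptance thresholds z(m, m') by wild bootstrap.

    Residuals of the largest (least smoothed) model are multiplied by random
    Rademacher signs to produce pure-noise resamples that mimic the possibly
    heterogeneous noise; z(m, m') is taken as the (1 - alpha)-quantile of the
    resulting null distances between the two fits (a simplification of the
    propagation-condition adjustment described in the talk).
    """
    rng = random.Random(seed)
    fit0 = estimators[0](y)
    resid = [yi - fi for yi, fi in zip(y, fit0)]
    M = len(estimators)
    samples = {(m, m1): [] for m in range(M) for m1 in range(m)}
    for _ in range(n_boot):
        yb = [r * rng.choice((-1.0, 1.0)) for r in resid]  # wild resample
        fits = [est(yb) for est in estimators]
        for (m, m1), vals in samples.items():
            vals.append(sup_dist(fits[m], fits[m1]))
    k = min(int((1 - alpha) * n_boot), n_boot - 1)
    return {key: sorted(vals)[k] for key, vals in samples.items()}


def smallest_accepted(estimators, y, z):
    """Select the smallest (most smoothed) model that is accepted, i.e. whose
    fit stays within the calibrated threshold of every larger model."""
    fits = [est(y) for est in estimators]
    for m in range(len(estimators) - 1, -1, -1):
        if all(sup_dist(fits[m], fits[m1]) <= z[(m, m1)] for m1 in range(m)):
            return m
    return 0  # the largest model is always accepted (no comparisons)


# Hypothetical usage on a noisy sine signal.
rng = random.Random(42)
y = [math.sin(i / 10.0) + 0.3 * rng.gauss(0.0, 1.0) for i in range(100)]
family = [moving_average(w) for w in (2, 4, 8, 16)]  # variance decreases in m
z = wild_bootstrap_thresholds(family, y)
m_hat = smallest_accepted(family, y, z)
theta_hat = family[m_hat](y)  # the adaptive estimator \tilde{\theta}_{\hat{m}}
```

The family index runs from the largest-variance model (index 0) to the smallest; the loop in `smallest_accepted` walks from the most smoothed model downward and returns the first one accepted against all larger models, which is exactly the "smallest accepted" choice.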