Nonparametric instrumental variable estimation under monotonicity

Speaker: Denis Chetverikov (UCLA, USA)
Date: Wednesday, December 3, 2014, 10:00 am
Location: WIAS, Erhard-Schmidt-Saal, Mohrenstraße 39, 10117 Berlin

The ill-posedness of the inverse problem of recovering a regression function in a nonparametric instrumental variable (NPIV) model leads to estimators that may suffer from a very slow, logarithmic rate of convergence. In this paper, we show that restricting attention to models with a monotone regression function and a monotone instrument significantly weakens the ill-posedness of the problem. Under these two monotonicity assumptions, the constrained estimator that imposes monotonicity attains the same asymptotic rate of convergence as the unconstrained estimator, but its finite-sample risk bounds are much better than the asymptotic rate suggests whenever the regression function is not too steep. In the absence of the point-identifying assumption of completeness, we also derive non-trivial identification bounds on the regression function implied by the two monotonicity assumptions. Finally, we provide a new adaptive test of the monotone-instrument assumption and a simulation study that demonstrates significant finite-sample performance gains from imposing monotonicity.
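To give a flavor of the idea, the following is a minimal, stylized sketch (not the paper's estimator): a simulated NPIV design with a monotone regression function and a monotone instrument, a series estimator on a piecewise-constant basis, and a monotonicity constraint imposed afterwards by projecting the coefficients onto the monotone cone via the pool-adjacent-violators algorithm. The simulated design, the basis, and the projection step are all illustrative choices made here for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000

# Stylized NPIV design: instrument W, endogenous regressor X, outcome Y.
# U is an unobserved confounder entering both X and the error, so that
# E[error | X] != 0 but E[error | W] = 0; X is monotone in W.
W = rng.uniform(0.0, 1.0, n)
U = rng.normal(0.0, 1.0, n)
X = np.clip(W + 0.3 * U, 0.0, 1.0)
Y = X ** 2 + 0.5 * U + 0.2 * rng.normal(0.0, 1.0, n)  # g(x) = x^2 is monotone


def pc_basis(v, K):
    """Piecewise-constant (bin-indicator) basis on K equal-width bins of [0, 1]."""
    idx = np.minimum((v * K).astype(int), K - 1)
    B = np.zeros((len(v), K))
    B[np.arange(len(v)), idx] = 1.0
    return B


def pav(y):
    """Pool-adjacent-violators: Euclidean projection onto nondecreasing vectors."""
    blocks = []  # each block: [value, weight]
    for yi in map(float, y):
        blocks.append([yi, 1.0])
        while len(blocks) > 1 and blocks[-2][0] > blocks[-1][0]:
            v2, w2 = blocks.pop()
            v1, w1 = blocks.pop()
            blocks.append([(v1 * w1 + v2 * w2) / (w1 + w2), w1 + w2])
    out = []
    for val, wt in blocks:
        out.extend([val] * int(wt))
    return np.array(out)


K = 5
P = pc_basis(X, K)  # basis in the endogenous regressor
Q = pc_basis(W, K)  # instrument basis

# Unconstrained series NPIV coefficients: solve the sample moment equations
# Q'(Y - P b) = 0, i.e. b = (Q'P)^{-1} Q'Y.
b_unc = np.linalg.solve(Q.T @ P, Q.T @ Y)

# Constrained estimator: project the coefficients onto the monotone cone.
b_mon = pav(b_unc)

# Compare risk against (approximate) true bin values g(midpoint) = midpoint^2.
truth = ((np.arange(K) + 0.5) / K) ** 2
risk_unc = np.mean((b_unc - truth) ** 2)
risk_mon = np.mean((b_mon - truth) ** 2)
```

Because the truth here is monotone and the monotone cone is convex, the projection step can never increase the distance to the truth, so `risk_mon <= risk_unc` holds by construction in this sketch; the paper's point is the much stronger statement that the finite-sample risk bounds improve substantially when the regression function is not too steep.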