Higher-order Improvements of a Computationally Attractive k-Step Bootstrap for Extremum Estimators
This paper establishes the higher-order equivalence of the k-step bootstrap, introduced recently by Davidson and MacKinnon (1999a), and the standard bootstrap. The k-step bootstrap is a computationally attractive alternative to the standard bootstrap for statistics based on nonlinear extremum estimators, such as generalized method of moments and maximum likelihood estimators. The paper also extends results of Hall and Horowitz (1996) to provide new results regarding the higher-order improvements of the standard bootstrap and the k-step bootstrap for extremum estimators (compared to procedures based on first-order asymptotics). The results of the paper apply to Newton-Raphson (NR), default NR, line-search NR, and Gauss-Newton k-step bootstrap procedures. The results apply to the nonparametric iid bootstrap, non-overlapping and overlapping block bootstraps, and restricted and unrestricted parametric bootstraps. The results cover symmetric and equal-tailed two-sided t tests and confidence intervals, one-sided t tests and confidence intervals, Wald tests and confidence regions, and J tests of over-identifying restrictions. The optimal block length for the accuracy of tests and confidence intervals is shown to be proportional to N^{1/4} for both non-overlapping and overlapping block bootstraps in the context considered. In addition, the paper provides some results that establish the equivalence of the higher-order efficiency of non-bootstrap k-step statistics and extremum statistics. These results extend results of Pfanzagl (1974), Robinson (1988), and others.
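The computational appeal described in the abstract comes from replacing full re-optimization on every bootstrap sample with only k Newton-Raphson steps, warm-started at the full-sample estimate. The following is a minimal illustrative sketch of that idea (not code from the paper), using the nonparametric iid bootstrap and a deliberately simple extremum estimator, the exponential-rate MLE; all function names and the example model are assumptions for illustration.

```python
import numpy as np

def score(lam, x):
    # First derivative of the exponential log-likelihood: n/lam - sum(x).
    return x.size / lam - x.sum()

def hessian(lam, x):
    # Second derivative of the exponential log-likelihood: -n/lam^2.
    return -x.size / lam**2

def k_step_bootstrap(x, k=2, n_boot=500, seed=0):
    """Illustrative k-step NR bootstrap: on each bootstrap sample, take only
    k Newton-Raphson steps from the full-sample estimate instead of
    iterating the optimizer to convergence."""
    rng = np.random.default_rng(seed)
    lam_hat = 1.0 / x.mean()  # full-sample MLE of the exponential rate
    boot = np.empty(n_boot)
    for b in range(n_boot):
        # Nonparametric iid bootstrap: resample the data with replacement.
        xb = rng.choice(x, size=x.size, replace=True)
        lam = lam_hat  # warm start at the original estimate
        for _ in range(k):
            # One Newton-Raphson step on the bootstrap log-likelihood.
            lam = lam - score(lam, xb) / hessian(lam, xb)
        boot[b] = lam
    return lam_hat, boot
```

Because NR converges quadratically from a good starting point, even k = 1 or 2 steps per bootstrap sample can track the fully iterated bootstrap estimates closely, which is the source of the higher-order equivalence the paper establishes.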
Andrews, Donald W.K., "Higher-order Improvements of a Computationally Attractive k-Step Bootstrap for Extremum Estimators" (1999). Cowles Foundation Discussion Paper No. 1478.