Document Type

Discussion Paper

Publication Date

9-1-2003

CFDP Number

1437

CFDP Pages

47

Abstract

A new class of kernel estimates is proposed for long run variance (LRV) and heteroskedasticity and autocorrelation consistent (HAC) estimation. The kernels, called steep origin kernels, are related to a class of sharp origin kernels explored by the authors (2003) in other work. They are constructed by exponentiating a mother kernel (a conventional lag kernel that is smooth at the origin) and can be used without truncation or bandwidth parameters. When the exponent passes to infinity with the sample size, these kernels produce consistent LRV/HAC estimates. The new estimates are shown to have limiting normal distributions, and formulae for the asymptotic bias and variance are derived. With steep origin kernel estimation, bandwidth selection is replaced by exponent selection, and data-based selection is possible. Rules for exponent selection based on minimum mean squared error (MSE) criteria are developed. Optimal rates for steep origin kernels based on exponentiating quadratic kernels are shown to be faster than those based on exponentiating the Bartlett kernel, which produces the sharp origin kernel. It is further shown that, unlike conventional kernel estimation, where an optimal choice of kernel is possible in terms of MSE criteria (Priestley, 1962; Andrews, 1991), steep origin kernels are asymptotically MSE equivalent, so the choice of mother kernel does not matter asymptotically. The approach is extended to spectral estimation at frequencies ω ≠ 0. Simulation evidence is reported detailing the finite-sample performance of steep kernel methods in LRV/HAC estimation and robust regression testing, in comparison with sharp kernel and conventional (truncated) kernel methods.
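The construction described above admits a compact illustration. Writing the estimate as a weighted sum of sample autocovariances with weights k(j/T)^ρ, where k is the mother kernel and ρ the exponent, the sketch below shows how exponentiation replaces truncation: every lag up to T - 1 receives a weight, and increasing ρ steepens the kernel near the origin instead of cutting lags off at a bandwidth. This is an illustrative sketch, not code from the paper; the function names, the choice of the Parzen kernel as the quadratic mother kernel, and the fixed exponent in the demonstration are all assumptions, and in practice ρ would be chosen by the MSE-based rules the paper develops.

import numpy as np

def exponentiated_kernel_lrv(u, rho, mother="parzen"):
    # LRV estimate Sum_{|j| < T} k(j/T)^rho * Gamma_hat(j): all lags enter,
    # so no truncation or bandwidth parameter is needed (hypothetical sketch).
    u = np.asarray(u, dtype=float)
    T = u.size
    u = u - u.mean()

    def parzen(x):  # smooth (quadratic) at the origin -> steep origin kernel
        ax = abs(x)
        if ax <= 0.5:
            return 1.0 - 6.0 * ax**2 + 6.0 * ax**3
        return 2.0 * (1.0 - ax) ** 3 if ax <= 1.0 else 0.0

    def bartlett(x):  # kinked at the origin -> sharp origin kernel
        return max(0.0, 1.0 - abs(x))

    k = parzen if mother == "parzen" else bartlett

    # Sample autocovariances Gamma_hat(j) = (1/T) sum_t u_t u_{t-j}, j = 0..T-1
    gamma = np.array([u[j:] @ u[:T - j] / T for j in range(T)])

    # Exponentiated kernel weights k(j/T)^rho over the full lag range
    w = np.array([k(j / T) ** rho for j in range(T)])

    return w[0] * gamma[0] + 2.0 * np.sum(w[1:] * gamma[1:])

# Demonstration on an AR(1) with coefficient 0.5, whose true LRV is
# 1 / (1 - 0.5)^2 = 4; the exponent rho = 16 is purely illustrative.
rng = np.random.default_rng(0)
e = rng.standard_normal(2000)
u = np.zeros_like(e)
for t in range(1, e.size):
    u[t] = 0.5 * u[t - 1] + e[t]
print(exponentiated_kernel_lrv(u, rho=16.0))

Steepening the weights in this way mimics the effect of a shrinking bandwidth while keeping all sample autocovariances in play, which is the sense in which exponent selection replaces bandwidth selection.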

Included in

Economics Commons
