This paper studies the effects of spurious detrending in regression. The asymptotic behavior of traditional least squares estimators and tests is examined in the context of models whose generating mechanism is systematically misspecified by the presence of deterministic time trends. Most previous work on the subject has relied on Monte Carlo studies to understand the issues involved in detrending data generated by integrated processes, and our analytical results help to shed light on many of the simulation findings. Standard F tests and Hausman tests are shown to discriminate inadequately between the competing hypotheses. Durbin-Watson statistics, on the other hand, are shown to be valuable measures of series stationarity. The asymptotic properties of regressions and excess volatility tests with detrended integrated time series are also explored.
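The phenomenon the abstract describes can be illustrated with a minimal simulation sketch (not taken from the paper): a driftless random walk is regressed on a deterministic linear trend, and the Durbin-Watson statistic of the detrended residuals is computed. Values far below 2 signal strongly positively autocorrelated, nonstationary residuals, consistent with the claim that the Durbin-Watson statistic is informative about series stationarity. All names and parameters here (sample size, seed) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# An integrated process: driftless random walk with standard normal increments
y = np.cumsum(rng.standard_normal(n))

# "Spurious detrending": OLS regression of y on an intercept and linear time trend,
# even though the true generating mechanism contains no deterministic trend
t = np.arange(n)
X = np.column_stack([np.ones(n), t])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta

# Durbin-Watson statistic of the detrended series; for a detrended random walk
# it converges toward 0 as n grows, flagging the residual nonstationarity
dw = np.sum(np.diff(resid) ** 2) / np.sum(resid ** 2)
print(f"Durbin-Watson statistic: {dw:.3f}")
```

For a stationary series the statistic would typically lie near 2; the detrended random walk instead produces a value close to 0, which is the diagnostic behavior the paper analyzes asymptotically.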
Durlauf, Steven N., and Peter C.B. Phillips. "Trends versus Random Walks in Time Series Analysis" (1986). Cowles Foundation Discussion Paper No. 1031.