A Zero-One Result for the Least Squares Estimator
The least squares estimator for the linear regression model is shown to converge to the true parameter vector either with probability one or with probability zero under weak conditions on the dependent random variable and regressor variables. No additional conditions are placed on the errors. The dependent and regressor variables are assumed to be weakly dependent, in particular, to be strong mixing. The regressors may be fixed or random and must exhibit a certain degree of independent variability. No further assumptions are needed. The model considered allows the number of regressors to increase without bound as the sample size increases. The proof proceeds by extending Kolmogorov's 0-1 law for independent random variables to strong mixing random variables.
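The convergence behavior described above can be illustrated with a small simulation. The sketch below is purely illustrative and not from the paper: it generates regressors from an AR(1) process (a standard example of a strong mixing sequence), computes the least squares estimator, and shows the estimate approaching the true parameter vector as the sample size grows. All specific choices (the AR(1) coefficient, i.i.d. Gaussian errors, the parameter values) are hypothetical simplifications of the paper's much weaker conditions.

```python
import numpy as np

rng = np.random.default_rng(0)

def ols(X, y):
    # Least squares estimator: beta_hat = (X'X)^{-1} X'y
    return np.linalg.solve(X.T @ X, X.T @ y)

# Hypothetical true parameter vector (intercept and slope)
beta = np.array([1.0, -2.0])

def simulate(n, rho=0.5):
    # AR(1) regressor: a simple example of a weakly dependent
    # (strong mixing) sequence; rho is an illustrative choice
    x = np.empty(n)
    x[0] = rng.normal()
    for t in range(1, n):
        x[t] = rho * x[t - 1] + rng.normal()
    X = np.column_stack([np.ones(n), x])
    # i.i.d. errors used here only for simplicity; the paper
    # places no additional conditions on the errors
    y = X @ beta + rng.normal(size=n)
    return ols(X, y)

# The estimation error shrinks as the sample size increases
err_small = np.linalg.norm(simulate(100) - beta)
err_large = np.linalg.norm(simulate(100_000) - beta)
print(err_small, err_large)
```

With the larger sample the estimate lands very close to the true vector, consistent with the almost-sure convergence the paper establishes (the zero-one dichotomy itself concerns whether such convergence has probability one or zero, which a single simulated path cannot exhibit).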
Andrews, Donald W.K., "A Zero-One Result for the Least Squares Estimator" (1984). Cowles Foundation Discussion Papers. 931.