Date of Award

Spring 2022

Document Type


Degree Name

Doctor of Philosophy (PhD)
First Advisor

Andrews, Donald
Abstract

This dissertation presents three essays on inference for nonparametric regression models.

The first chapter, “Bias-Aware Inference for Conditional Average Treatment Effect Functions,” proposes a new method to construct a confidence band for the conditional average treatment effect (CATE) as a function of a continuous covariate in a randomized controlled trial. The confidence band is bias-aware: it accounts for the maximum smoothing bias of the nonparametric estimators used to construct it. I provide a computationally simple procedure to obtain a bias-aware confidence band whose half-length at each evaluation point is asymptotically shortest uniformly over the domain of the CATE function. This optimality holds over a class of confidence bands satisfying a set of natural restrictions on the bandwidths used in their construction. Using a simulation design that mimics features of the randomized controlled trial in Bryan et al. (2021), I show that my confidence band performs favorably in finite-sample coverage and length when compared to a confidence band based on debiased estimators. Additional Monte Carlo simulation results support this finding.

The second chapter, “Inference in Regression Discontinuity Designs under Monotonicity” (with Soonwoo Kwon), provides an inference procedure for the sharp regression discontinuity design (RDD) under monotonicity. Specifically, we consider the case where the true regression function is monotone with respect to all or some of the running variables and is assumed to lie in a Lipschitz smoothness class. Such a monotonicity condition is natural in many empirical contexts, and the Lipschitz constant has an intuitive interpretation. We propose a minimax two-sided confidence interval (CI) and an adaptive one-sided CI. For the two-sided CI, the researcher is required to choose a bound on the first derivative of the regression function. This is the only tuning parameter, and the resulting CI has uniform coverage and attains the minimax optimal length. The one-sided CI can be constructed to maintain coverage over all monotone functions, providing maximum credibility with respect to the choice of the Lipschitz constant. Moreover, monotonicity allows the (excess) length of the CI to adapt to the true Lipschitz constant of the unknown regression function.

The third chapter, “Adaptive Inference in Multivariate Nonparametric Regression Models Under Monotonicity” (with Soonwoo Kwon), considers the problem of adaptive inference on a regression function at a point in a multivariate nonparametric regression setting. The regression function belongs to a Hölder class and is assumed to be monotone with respect to some or all of its arguments. We derive the minimax rate of convergence for confidence intervals that adapt to the underlying smoothness, and we provide an adaptive inference procedure that attains this rate. The procedure differs from that of Cai and Low (2004a) and is intended to yield shorter CIs under practically relevant specifications. The proposed method applies to general linear functionals of the regression function and is shown to perform favorably compared to existing adaptive procedures.