Date of Award

Spring 2022

Document Type

Dissertation

Degree Name

Doctor of Philosophy (PhD)

Department

Statistics and Data Science

First Advisor

Fan, Zhou

Abstract

This dissertation highlights the importance of methodological development and the need for tailored algorithms in non-convex statistical problems. Specifically, we study three non-convex estimation problems, developing novel ideas and techniques in both statistical methodology and algorithmic design.

Chapter 2 discusses my work with Zhou Fan on estimation of a piecewise-constant image, or a gradient-sparse signal on a general graph, from noisy linear measurements. We propose and study an iterative algorithm to minimize a penalized least-squares objective, with the penalty given by the "$\ell_0$-norm" of the signal's discrete graph gradient. The method uses a non-convex variant of proximal gradient descent, applying the alpha-expansion procedure to approximate the proximal mapping in each iteration and geometrically decaying the penalty parameter across iterations to ensure convergence. Under a cut-restricted isometry property for the measurement design, we prove global recovery guarantees for the estimated signal. For standard Gaussian designs, the required number of measurements is independent of the graph structure and improves upon worst-case guarantees for total-variation (TV) compressed sensing on the 1-D line and 2-D lattice graphs by polynomial and logarithmic factors, respectively. Empirically, the method yields lower mean-squared recovery error than TV regularization in regimes of moderate undersampling and moderate to high signal-to-noise, for several examples of changepoint signals and gradient-sparse phantom images.

Chapter 3 discusses my work with Zhou Fan and Sahand Negahban on tree-projected gradient descent for estimating gradient-sparse parameters. We consider estimating a gradient-sparse parameter $\boldsymbol{\theta}^*\in\mathbb{R}^p$ having strong gradient-sparsity $s^*:=\|\nabla_G \boldsymbol{\theta}^*\|_0$ on an underlying graph $G$. Given observations $Z_1,\ldots,Z_n$ and a smooth, convex loss function $\mathcal{L}$ for which $\boldsymbol{\theta}^*$ minimizes the population risk $\mathbb{E}[\mathcal{L}(\boldsymbol{\theta};Z_1,\ldots,Z_n)]$, we propose to estimate $\boldsymbol{\theta}^*$ by a projected gradient descent algorithm that iteratively and approximately projects gradient steps onto spaces of vectors having small gradient-sparsity over low-degree spanning trees of $G$. We show that, under suitable restricted strong convexity and smoothness assumptions for the loss, the resulting estimator achieves the squared-error risk $\frac{s^*}{n} \log (1+\frac{p}{s^*})$ up to a multiplicative constant that is independent of $G$. In contrast, previous polynomial-time algorithms have only been shown to achieve this guarantee in more specialized settings, or under additional assumptions on $G$ and/or the sparsity pattern of $\nabla_G \boldsymbol{\theta}^*$. As applications of this general framework, we instantiate our results for linear models and generalized linear models with random design.

Chapter 4 discusses my joint work with Zhou Fan, Roy R. Lederman, Yi Sun, and Tianhao Wang on maximum likelihood for high-noise group orbit estimation. Motivated by applications to single-particle cryo-electron microscopy (cryo-EM), we study several problems of function estimation in a low signal-to-noise ratio (SNR) regime, where samples are observed under random rotations of the function domain. In a general framework of group orbit estimation with linear projection, we describe a stratification of the Fisher information eigenvalues according to a sequence of transcendence degrees in the invariant algebra, and we relate critical points of the log-likelihood landscape to a sequence of method-of-moments optimization problems. This extends previous results for a discrete rotation group without projection. We then compute these transcendence degrees and the forms of the moment optimization problems for several examples of function estimation under $\mathsf{SO}(2)$ and $\mathsf{SO}(3)$ rotations. For several of these examples, we affirmatively resolve numerical conjectures that third-order moments suffice to locally identify a generic signal up to its rotational orbit, and we confirm the existence of spurious local optima in the landscape of the population log-likelihood. For low-dimensional approximations of the electric potential maps of two small protein molecules, we empirically verify that the noise scalings of the Fisher information eigenvalues conform with these theoretical predictions over a range of SNR, in a model of $\mathsf{SO}(3)$ rotations without projection.
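To make the Chapter 2 procedure concrete, below is a minimal, illustrative sketch (not the dissertation's implementation) of a penalized proximal gradient scheme of the kind described above, specialized to a 1-D chain graph where the $\ell_0$ graph-gradient proximal mapping can be computed exactly by changepoint dynamic programming. On general graphs the dissertation instead approximates this step with the alpha-expansion procedure; all function names, step sizes, and the decay schedule here are assumptions for illustration.

```python
import numpy as np

def prox_l0_grad_1d(z, lam):
    """Exact prox of theta -> lam * ||grad theta||_0 on a 1-D chain graph:
    minimize 0.5*||theta - z||^2 + lam * (# changepoints), via O(n^2) DP."""
    n = len(z)
    s1 = np.concatenate(([0.0], np.cumsum(z)))      # prefix sums of z
    s2 = np.concatenate(([0.0], np.cumsum(z ** 2)))  # prefix sums of z^2

    def seg_cost(i, j):
        # 0.5 * squared error of fitting z[i:j] by its mean
        return 0.5 * (s2[j] - s2[i] - (s1[j] - s1[i]) ** 2 / (j - i))

    best = np.full(n + 1, np.inf)
    back = np.zeros(n + 1, dtype=int)
    best[0] = -lam                                   # first segment is not penalized
    for j in range(1, n + 1):
        for i in range(j):
            c = best[i] + lam + seg_cost(i, j)
            if c < best[j]:
                best[j], back[j] = c, i

    theta, j = np.empty(n), n
    while j > 0:                                     # backtrack: fill segments with means
        i = back[j]
        theta[i:j] = (s1[j] - s1[i]) / (j - i)
        j = i
    return theta

def l0_prox_gradient(X, y, lam_init, lam_final, decay=0.8, n_iter=50):
    """Non-convex proximal gradient descent for
    0.5*||y - X theta||^2 + lam*||grad theta||_0 on a chain graph,
    decaying the penalty parameter geometrically toward lam_final."""
    p = X.shape[1]
    step = 1.0 / np.linalg.norm(X, 2) ** 2           # 1/L for the least-squares loss
    theta, lam = np.zeros(p), lam_init
    for _ in range(n_iter):
        grad = X.T @ (X @ theta - y)
        theta = prox_l0_grad_1d(theta - step * grad, step * lam)
        lam = max(lam * decay, lam_final)            # geometric decay of the penalty
    return theta

# Toy example: recover a changepoint signal from undersampled Gaussian measurements.
rng = np.random.default_rng(0)
p, n_meas = 100, 60
theta_true = np.repeat([1.0, -2.0, 3.0, 0.0], 25)
X = rng.standard_normal((n_meas, p)) / np.sqrt(n_meas)
y = X @ theta_true + 0.05 * rng.standard_normal(n_meas)
theta_hat = l0_prox_gradient(X, y, lam_init=1.0, lam_final=1e-3)
print("recovery MSE:", np.mean((theta_hat - theta_true) ** 2))
```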
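Similarly, the Chapter 3 idea can be sketched on the simplest tree, a 1-D chain: each iteration takes a gradient step on a smooth loss and then projects onto vectors with at most $s$ changepoints. The exact dynamic-programming projection below stands in for the approximate projections over low-degree spanning trees of a general graph $G$ studied in the dissertation; the names, step size, and toy logistic-regression example are assumptions for illustration only.

```python
import numpy as np

def project_s_changepoints(z, s):
    """Euclidean projection of z onto vectors with at most s changepoints on a
    1-D chain (piecewise-constant with at most s+1 segments), via segmentation DP."""
    n = len(z)
    s1 = np.concatenate(([0.0], np.cumsum(z)))
    s2 = np.concatenate(([0.0], np.cumsum(z ** 2)))

    def seg_cost(i, j):                      # squared error of fitting z[i:j] by its mean
        return s2[j] - s2[i] - (s1[j] - s1[i]) ** 2 / (j - i)

    K = s + 1                                # maximum number of constant segments
    dp = np.full((K + 1, n + 1), np.inf)
    back = np.zeros((K + 1, n + 1), dtype=int)
    dp[0, 0] = 0.0
    for k in range(1, K + 1):
        for j in range(k, n + 1):
            for i in range(k - 1, j):
                c = dp[k - 1, i] + seg_cost(i, j)
                if c < dp[k, j]:
                    dp[k, j], back[k, j] = c, i

    k = int(np.argmin(dp[:, n]))             # best number of segments <= K
    theta, j = np.empty(n), n
    while j > 0:                             # backtrack segment boundaries
        i = back[k, j]
        theta[i:j] = (s1[j] - s1[i]) / (j - i)
        j, k = i, k - 1
    return theta

def chain_projected_gd(grad_loss, p, s, step, n_iter=200):
    """Projected gradient descent: gradient step on a smooth loss, then projection
    onto the gradient-sparse set (at most s changepoints on the chain graph)."""
    theta = np.zeros(p)
    for _ in range(n_iter):
        theta = project_s_changepoints(theta - step * grad_loss(theta), s)
    return theta

# Toy usage: logistic regression (a generalized linear model) with random design.
rng = np.random.default_rng(1)
n, p, s = 500, 60, 3
theta_star = np.repeat([2.0, -1.0, 0.0, 1.5], 15)
X = rng.standard_normal((n, p))
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-X @ theta_star)))
grad_loss = lambda t: X.T @ (1.0 / (1.0 + np.exp(-X @ t)) - y) / n
theta_hat = chain_projected_gd(grad_loss, p, s, step=0.5)
print("estimation error:", np.linalg.norm(theta_hat - theta_star) / np.sqrt(p))
```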
