"Algorithm Dynamics in Modern Statistical Learning: Asymptotics, Univer" by Tianhao Wang

Date of Award

Spring 2024

Document Type

Dissertation

Degree Name

Doctor of Philosophy (PhD)

Department

Statistics and Data Science

First Advisor

Fan, Zhou

Abstract

Understanding the dynamics of algorithms is crucial for characterizing the behavior of trained models in modern statistical learning. This thesis presents recent results on the theoretical analysis of the dynamics of two classes of algorithms: Approximate Message Passing (AMP) algorithms and Stochastic Gradient Descent (SGD). For AMP algorithms, the focus is on deriving a precise asymptotic distributional characterization of the iterates, known as the "state evolution," which summarizes the dynamics of AMP, and on understanding the universality of this characterization with respect to the underlying data distribution. For SGD, the goal is to analyze its trajectory in order to understand its implicit regularization, a property widely believed to be essential for the generalization of modern deep learning models. The results presented here provide unified frameworks for analyzing and understanding the dynamics of these algorithms, and can potentially be extended to other algorithms.
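As a rough illustration of what a state-evolution recursion looks like (a minimal sketch, not taken from the dissertation: it assumes the standard rank-one Z/2-synchronization spiked matrix model with the Bayes-optimal tanh denoiser, for which the AMP iterates are asymptotically Gaussian with an effective signal-to-noise ratio tracked by a one-dimensional recursion):

```python
import numpy as np

def state_evolution(snr, iters=50):
    """One-dimensional state evolution for AMP on a rank-one
    Z/2-symmetric spiked matrix model with the tanh denoiser.

    gamma_t tracks the effective signal-to-noise ratio of the AMP
    iterates and evolves as
        gamma_{t+1} = snr^2 * E[ tanh(gamma_t + sqrt(gamma_t) * Z) ],
    where Z ~ N(0, 1).
    """
    # Gauss-Hermite quadrature nodes/weights for the N(0,1) expectation
    z, w = np.polynomial.hermite_e.hermegauss(200)
    w = w / w.sum()
    gamma = 1e-3  # small positive initialization
    history = [gamma]
    for _ in range(iters):
        gamma = snr**2 * np.dot(w, np.tanh(gamma + np.sqrt(gamma) * z))
        history.append(gamma)
    return history

# Above the recovery threshold (snr > 1) the recursion climbs to a
# nontrivial fixed point; below it, gamma decays back toward zero.
hist = state_evolution(snr=1.5)
```

In this sketch the fixed point of the recursion predicts the limiting overlap between the AMP iterates and the planted signal, which is exactly the kind of distributional summary the thesis refers to as state evolution.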
