Date of Award
Doctor of Philosophy (PhD)
Statistics and Data Science
Neural networks have been intensively studied as machine learning models and widely applied in various areas. This thesis investigates three problems related to the theory and application of neural networks. First, we analyze a learning scheme for neural networks that uses random weights in the backpropagation training algorithm, which is considered more biologically plausible than the standard training procedure. We establish theory showing the convergence of the loss and the alignment between the forward weights of the network and the random weights used in the backward pass. Second, we study a family of optimization problems where the objective involves a trained generative network, with the goal of inverting the network. We introduce a novel algorithm that takes advantage of a sequential optimization technique to deal with the problem of non-convexity. The third part of this thesis is an application of modern neural network models to certain problems in neuroscience. We analyze data containing two concurrent imaging modalities of brain activity in mice, and build translation models to predict one modality from the other. Our study is one of the first examples of advanced machine learning models applied to concurrent multimodal brain imaging data and demonstrates the potential of deep neural networks in this emerging area of neuroscience.
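The random-feedback learning scheme described in the first part is commonly known as feedback alignment: the backward pass propagates the error through a fixed random matrix instead of the transpose of the forward weights. A minimal sketch on a toy two-layer linear network is given below; the dimensions, initialization, and learning rate are illustrative assumptions, not details taken from the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: targets generated by a ground-truth linear map.
X = rng.normal(size=(200, 10))
w_true = rng.normal(size=(10, 1))
y = X @ w_true

# Two-layer linear network. The backward pass uses a fixed random
# feedback matrix B in place of W2.T (feedback alignment).
W1 = rng.normal(size=(10, 20)) * 0.1
W2 = np.zeros((20, 1))          # output weights start at zero
B = rng.normal(size=(1, 20))    # fixed random feedback weights

lr = 0.01
losses = []
for _ in range(3000):
    H = X @ W1                  # hidden activations
    err = H @ W2 - y            # residuals, shape (200, 1)
    losses.append(float(np.mean(err ** 2)))
    grad_W2 = H.T @ err / len(X)
    # Key step: the error is sent backward through B, not W2.T.
    grad_W1 = X.T @ (err @ B) / len(X)
    W2 -= lr * grad_W2
    W1 -= lr * grad_W1

# Cosine similarity between the trained forward weights W2 and B.T:
# a positive value indicates the forward weights have aligned with
# the random feedback weights, the phenomenon analyzed in the thesis.
cos = float((W2.ravel() @ B.ravel())
            / (np.linalg.norm(W2) * np.linalg.norm(B)))
```

In this linear setting the training loss decreases and the alignment measure becomes positive, mirroring the two convergence properties the abstract mentions; the thesis establishes these guarantees rigorously rather than empirically.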
Song, Ganlin, "Learning, Optimization and Data Translation with Deep Neural Networks" (2021). Yale Graduate School of Arts and Sciences Dissertations. 417.