Date of Award

Spring 2022

Document Type

Dissertation

Degree Name

Doctor of Philosophy (PhD)

Department

Statistics and Data Science

First Advisor

Lederman, Roy


This thesis considers a variety of topics broadly unified under the theme of geometric integration for Riemannian manifold Hamiltonian Monte Carlo.

In chapter 2, we review fundamental topics in numerical computing (section 2.1), classical mechanics (section 2.2), integration on manifolds (section 2.3), stochastic differential equations (section 2.4), Riemannian geometry (section 2.5), information geometry (section 2.6), and Markov chain Monte Carlo (section 2.7). The purpose of these sections is to place the topics discussed in the thesis within a broader context. The subsequent chapters are largely self-contained, but refer back to this foundational material where appropriate.

Chapter 3 gives a formal means of conceptualizing the Markov chains corresponding to Riemannian manifold Hamiltonian Monte Carlo and related methods; this formalism is useful for understanding the significance of reversibility and volume preservation in maintaining detailed balance in Markov chain Monte Carlo. Throughout the remainder of the thesis, we investigate alternative methods of geometric numerical integration for use in Riemannian manifold Hamiltonian Monte Carlo, discuss numerical issues involving violations of reversibility and detailed balance, and propose new algorithms with superior theoretical foundations.

In chapter 4, we evaluate the implicit midpoint integrator for Riemannian manifold Hamiltonian Monte Carlo, marking the first time that this integrator has been deployed and assessed in this context. We discuss attributes of the implicit midpoint integrator that make it preferable, and others that make it inferior, to alternative methods of geometric integration such as the generalized leapfrog procedure.

In chapter 5, we treat the empirical question of the extent to which convergence thresholds play a role in geometric numerical integration for Riemannian manifold Hamiltonian Monte Carlo.
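To fix ideas, the implicit midpoint rule updates a state z by solving the implicit equation z' = z + h·f((z + z')/2), typically by fixed-point iteration run until successive iterates agree to within a tolerance; that tolerance is precisely the kind of convergence threshold at issue here. The sketch below is a minimal illustration of this mechanism on a toy harmonic oscillator, not the thesis implementation (the function names and solver details are our own):

```python
import numpy as np

def implicit_midpoint_step(f, z, h, tol=1e-12, max_iter=100):
    """One implicit midpoint step z' = z + h * f((z + z') / 2),
    solved by fixed-point iteration to tolerance `tol`."""
    z_next = z + h * f(z)  # explicit Euler initial guess
    for _ in range(max_iter):
        z_new = z + h * f(0.5 * (z + z_next))
        if np.max(np.abs(z_new - z_next)) < tol:
            return z_new
        z_next = z_new
    return z_next  # hit max_iter without meeting the threshold

# Toy example: harmonic oscillator H(q, p) = (q**2 + p**2) / 2,
# whose Hamiltonian vector field is f(q, p) = (p, -q).
f = lambda z: np.array([z[1], -z[0]])
z = np.array([1.0, 0.0])
energy0 = 0.5 * np.sum(z**2)
for _ in range(1000):
    z = implicit_midpoint_step(f, z, h=0.1)
energy_drift = abs(0.5 * np.sum(z**2) - energy0)
# The midpoint rule conserves quadratic invariants such as this energy,
# so the drift observed here is limited only by the solver tolerance.
```

Because the update is defined implicitly, each step costs several evaluations of f, which is one ingredient in the cost–accuracy trade-offs examined in chapters 4 and 5.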
If the convergence threshold is too large, the Markov chain transition kernel will fail to maintain detailed balance, whereas a convergence threshold that is very small will incur computational penalties. We investigate these phenomena and suggest two mechanisms, based on stochastic approximation and higher-order solvers for non-linear equations, which can aid in identifying convergence thresholds or diminish their significance.

In chapter 6, we consider a numerical integrator for Markov chain Monte Carlo based on the Lagrangian, rather than the Hamiltonian, formalism in classical mechanics. Our contributions include clarifying the order of accuracy of this numerical integrator, which has been misunderstood in the literature, and evaluating a simple change that can accelerate the implementation of the method, but which comes at the cost of producing more serially auto-correlated samples. We also discuss robustness properties of the Lagrangian numerical method that do not materialize in the Hamiltonian setting.

Chapter 7 examines theories of geometric ergodicity for Riemannian manifold Hamiltonian Monte Carlo and Lagrangian Monte Carlo, and proposes a simple modification to these Markov chain methods that enables geometric ergodicity to be inherited from the manifold Metropolis-adjusted Langevin algorithm.

In chapter 8, we show how to revise an explicit integrator using the theory of Lagrange multipliers so that the resulting numerical method satisfies the properties of reversibility and volume preservation.

Supplementary content appears in chapter E, which investigates topics in the theory of shadow Hamiltonians for the implicit midpoint method in the case of non-canonical Hamiltonian mechanics, and in chapter F, which treats the continual adaptation of a parameterized proposal distribution in the independent Metropolis-Hastings sampler.
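The link between the convergence threshold and reversibility can be made concrete with a simple numerical experiment: take an implicit step solved by fixed-point iteration, flip the momentum, step again, flip back, and measure the distance to the starting point. For an exactly solved symmetric integrator this round trip returns to the start; a truncated solver leaves a gap. The following self-contained sketch (our own illustration on pendulum-type dynamics, with assumed step sizes and tolerances, not an experiment from the thesis) exhibits this effect:

```python
import numpy as np

def implicit_midpoint_step(f, z, h, tol, max_iter=200):
    """Implicit midpoint step solved by fixed-point iteration to `tol`."""
    z_next = z + h * f(z)  # explicit Euler initial guess
    for _ in range(max_iter):
        z_new = z + h * f(0.5 * (z + z_next))
        if np.max(np.abs(z_new - z_next)) < tol:
            return z_new
        z_next = z_new
    return z_next

def reversibility_gap(tol, h=0.5, n=10):
    """Integrate forward n steps, flip the momentum, integrate n more
    steps, flip back, and return the distance to the starting point."""
    f = lambda z: np.array([z[1], -np.sin(z[0])])  # pendulum dynamics
    z0 = np.array([1.0, 0.5])
    z = z0.copy()
    for _ in range(n):
        z = implicit_midpoint_step(f, z, h, tol)
    z[1] = -z[1]  # momentum flip
    for _ in range(n):
        z = implicit_midpoint_step(f, z, h, tol)
    z[1] = -z[1]  # flip back for comparison
    return np.max(np.abs(z - z0))

gap_loose = reversibility_gap(tol=1e-2)
gap_tight = reversibility_gap(tol=1e-12)
# A loose threshold leaves a measurable reversibility gap; a tight one
# restores reversibility up to solver precision, at extra iteration cost.
```

Since the Metropolis–Hastings correction assumes a reversible, volume-preserving proposal map, a nonzero gap of this kind is exactly what perturbs detailed balance in the sampler.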