Speaker:
Institution:
Time:
Location:
The laws of classical mechanics, which describe the dynamics of moving bodies in physical space, have been a subject of great practical interest and have inspired centuries of deep mathematical study. In particular, the perspective that describes the dynamics of a particle as a Hamiltonian flow in phase space has found useful applications in designing algorithms such as Hamiltonian Monte Carlo (HMC) for sampling. In this talk, I will describe a time-varying integration-time scheme for HMC that provably accelerates sampling from a class of distributions, compared to using a fixed integration time. I will then switch to optimization and introduce an optimization algorithm called Hamiltonian Descent (HD), which can be viewed as a direct counterpart of HMC in sampling. When solving strongly convex quadratic problems, HD exhibits a novel update scheme that involves matrix-power-vector products. I will also cover Coordinate Hamiltonian Descent and its parallelizable variant, which turn out to encapsulate the classical Gauss-Seidel method, Successive Over-Relaxation, the Jacobi method, and more, for solving linear systems of equations.
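For readers who want a concrete picture of the sampling setup, below is a minimal sketch of a standard HMC sampler in which the integration time of each iteration is supplied by the caller, so that a time-varying schedule can be plugged in in place of a fixed constant. This is purely illustrative and is not the algorithm from the papers: the function names (`leapfrog`, `hmc_sample`), the Gaussian example target, and the placeholder constant schedule are my assumptions; the actual Chebyshev-based schedule and its analysis are in the ICLR 2023 paper.

```python
# Illustrative HMC sampler with a caller-supplied integration-time schedule.
# NOT the algorithm from the papers; a plain-vanilla sketch for context.
import numpy as np

def leapfrog(x, p, grad_f, step_size, n_steps):
    """Standard leapfrog integrator for the Hamiltonian f(x) + ||p||^2 / 2."""
    p = p - 0.5 * step_size * grad_f(x)
    for _ in range(n_steps - 1):
        x = x + step_size * p
        p = p - step_size * grad_f(x)
    x = x + step_size * p
    p = p - 0.5 * step_size * grad_f(x)
    return x, p

def hmc_sample(x0, f, grad_f, integration_times, step_size=0.1, rng=None):
    """Run one HMC iteration per entry of `integration_times`.

    A constant list recovers fixed-integration-time HMC; a varying list
    implements a time-varying schedule such as the one studied in the talk.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    samples = []
    for T in integration_times:
        n_steps = max(1, int(round(T / step_size)))
        p = rng.standard_normal(x.shape)
        x_new, p_new = leapfrog(x, p, grad_f, step_size, n_steps)
        # Metropolis correction for the leapfrog discretization error.
        h_old = f(x) + 0.5 * np.dot(p, p)
        h_new = f(x_new) + 0.5 * np.dot(p_new, p_new)
        if np.log(rng.uniform()) < h_old - h_new:
            x = x_new
        samples.append(x)
    return np.array(samples)

if __name__ == "__main__":
    # Example target: 2D Gaussian with covariance diag(1, 10),
    # i.e. f(x) = 0.5 * x^T A x with A = diag(1, 0.1).
    A = np.diag([1.0, 0.1])
    f = lambda x: 0.5 * x @ A @ x
    grad_f = lambda x: A @ x
    times = np.full(1000, 1.0)  # placeholder fixed schedule; see the paper
    samples = hmc_sample(np.zeros(2), f, grad_f, times)
    print(samples.mean(axis=0), samples.var(axis=0))
```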
The talk is based on the following two papers, both joint work with Andre Wibisono (Yale).
1. Hamiltonian Descent and Coordinate Hamiltonian Descent. arXiv preprint.
2. Accelerating Hamiltonian Monte Carlo via Chebyshev Integration Time. ICLR 2023.