Poster

Hamiltonian Monte Carlo on ReLU Neural Networks is Inefficient

Vu Dinh · Lam Ho · Cuong V. Nguyen

Thu 12 Dec 11 a.m. PST — 2 p.m. PST

Abstract: We analyze the error rates of the Hamiltonian Monte Carlo (HMC) algorithm with the leapfrog integrator for Bayesian neural network inference. We prove a new result: due to the non-differentiability of activation functions in the ReLU family, leapfrog HMC for networks with these activations has a large local error rate of $\Omega(\epsilon)$ rather than the classical error rate of $\mathcal{O}(\epsilon^3)$. This leads to a higher rejection rate of the proposals, making the method inefficient. We then verify our theoretical findings through empirical simulations, highlighting the inefficiency of HMC inference on ReLU-based neural networks compared to networks with analytical activation functions.
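The gap between the two error rates can be illustrated with a minimal one-dimensional sketch (not the paper's experimental setup; the potentials, initial conditions, and step sizes below are illustrative assumptions). A single leapfrog step is applied to a smooth potential $U(q) = q^2/2$ and to a kinked, ReLU-style potential $U(q) = |q|$, starting just to the left of the kink so the step crosses it; the resulting energy error (which drives the Metropolis rejection rate) shrinks like a high power of $\epsilon$ in the smooth case but only like $\epsilon$ across the kink.

```python
def leapfrog_step(q, p, grad_U, eps):
    """One leapfrog step for the Hamiltonian H(q, p) = U(q) + p^2 / 2."""
    p = p - 0.5 * eps * grad_U(q)  # half-step momentum update
    q = q + eps * p                # full-step position update
    p = p - 0.5 * eps * grad_U(q)  # half-step momentum update
    return q, p

def energy_error(U, grad_U, q0, p0, eps):
    """Absolute change in H after one leapfrog step (exact dynamics conserve H)."""
    q1, p1 = leapfrog_step(q0, p0, grad_U, eps)
    return abs((U(q1) + 0.5 * p1 ** 2) - (U(q0) + 0.5 * p0 ** 2))

# Subgradient of |q|: -1 left of the kink, +1 right of it (0 at the kink).
grad_abs = lambda q: (q > 0) - (q < 0)

for eps in (1e-1, 1e-2, 1e-3):
    # Start at q0 = -eps/4 with unit momentum, so the step straddles q = 0.
    smooth = energy_error(lambda q: 0.5 * q * q, lambda q: q, -0.25 * eps, 1.0, eps)
    kink = energy_error(abs, grad_abs, -0.25 * eps, 1.0, eps)
    print(f"eps={eps:.0e}  smooth={smooth:.3e}  kink={kink:.3e}")
```

Halving $\epsilon$ roughly halves the kink-crossing error (first order), while the smooth-potential error drops by orders of magnitude, so for non-smooth potentials the step size must be made much smaller to keep the Metropolis acceptance rate high.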
