Tutorial
(Track 1) There and Back Again: A Tale of Slopes and Expectations
Marc Deisenroth · Cheng Soon Ong
Integration and differentiation play key roles in machine learning.
We take a tour of old and new results on methods and algorithms for integration and differentiation, in particular for calculating expectations and slopes. We review numerical and Monte-Carlo integration for calculating expectations. We then discuss the change-of-variables method, which leads to normalizing flows, and inference in time series, to get "there".
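As a minimal illustration of Monte-Carlo integration (our own sketch, not material from the tutorial; the integrand f and the sample size are arbitrary choices), an expectation E[f(x)] is estimated by averaging f over samples from the distribution:

    import jax
    import jax.numpy as jnp

    def f(x):
        return jnp.sin(x) ** 2  # arbitrary integrand, chosen for illustration

    key = jax.random.PRNGKey(0)
    samples = jax.random.normal(key, shape=(10_000,))  # draws x_i ~ N(0, 1)
    estimate = jnp.mean(f(samples))  # E[f(x)] ≈ (1/N) Σ_i f(x_i)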
To get "back again", we review gradients for calculating slopes by the chain rule and automatic differentiation, the basis for backpropagation in neural networks. We discuss backpropagation in three settings: in probabilistic graphical models, through an equality constraint, and with an inequality constraint.
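As a minimal sketch of automatic differentiation (again our own illustrative JAX example, not the tutorial's code; the composite function h is an arbitrary choice), reverse-mode autodiff applies the chain rule mechanically:

    import jax
    import jax.numpy as jnp

    # h(x) = g(f(x)) with f(x) = x^2 and g(u) = tanh(u); autodiff applies
    # the chain rule h'(x) = g'(f(x)) * f'(x) for us.
    def h(x):
        return jnp.tanh(x ** 2)

    x = 1.5
    auto = jax.grad(h)(x)                          # automatic differentiation
    manual = (1 - jnp.tanh(x ** 2) ** 2) * 2 * x   # chain rule by hand
    # auto and manual agree up to floating-point error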
To complete the round-trip, we explore algorithms for calculating gradients of expectations, the basis of methods for variational inference, reinforcement learning, and experimental design.
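Two standard estimators for such gradients of expectations are the score-function (REINFORCE) estimator and the reparameterization (pathwise) estimator; the sketch below (an illustrative example of ours, with an arbitrary f and a Gaussian N(mu, 1)) computes both for d/dmu E[f(x)]:

    import jax
    import jax.numpy as jnp

    def f(x):
        return x ** 2  # illustrative integrand; exact gradient of E[f] is 2*mu

    mu = 0.5
    eps = jax.random.normal(jax.random.PRNGKey(1), shape=(100_000,))
    x = mu + eps  # x ~ N(mu, 1), written in reparameterized form

    # Score-function (REINFORCE): d/dmu E[f(x)] = E[f(x) * d log p(x; mu)/dmu],
    # and for N(mu, 1) the score is d log p/dmu = (x - mu).
    score_grad = jnp.mean(f(x) * (x - mu))

    # Reparameterization (pathwise): d/dmu E[f(mu + eps)] = E[f'(mu + eps)].
    pathwise_grad = jnp.mean(jax.vmap(jax.grad(f))(x))

    # Both estimates are close to the exact gradient 2 * mu = 1.0.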