Poster
Non-Stationary Learning of Neural Networks with Automatic Soft Parameter Reset
Alexandre Galashov · Michalis Titsias · András György · Clare Lyle · Razvan Pascanu · Yee Whye Teh · Maneesh Sahani
West Ballroom A-D #7200
Neural networks are most often trained under the assumption that data come from a stationary distribution. However, settings in which this assumption is violated are of increasing importance; examples include supervised learning with distributional shifts, reinforcement learning, continual learning, and non-stationary contextual bandits. Here, we introduce a novel learning approach that automatically models and adapts to non-stationarity by linking parameters through an Ornstein-Uhlenbeck process with an adaptive drift parameter. The adaptive drift draws the parameters towards the distribution used at initialisation, so the approach can be understood as a form of soft parameter reset. We show empirically that our approach performs well in non-stationary supervised learning and off-policy reinforcement learning settings.
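To make the "soft parameter reset" intuition concrete, the following is a minimal illustrative sketch of one discretized Ornstein-Uhlenbeck step that pulls parameters toward their initialisation. The function name, the fixed `drift` argument, and the particular discretization are assumptions for illustration only; in the paper the drift is adapted automatically rather than supplied as a constant.

```python
import numpy as np

def soft_reset_step(params, init_params, drift, noise_scale, rng):
    """One discretized Ornstein-Uhlenbeck update toward the
    initialisation (a soft parameter reset).

    drift in [0, 1]: 0 leaves the parameters unchanged,
    1 draws them fully back to the initialisation (plus noise)."""
    noise = rng.normal(scale=noise_scale, size=params.shape)
    # Interpolate between current parameters and the init values,
    # with diffusion noise scaled by the drift strength.
    return (1.0 - drift) * params + drift * init_params + np.sqrt(drift) * noise

# Example: a large adaptive drift acts like a near-reset,
# while drift close to 0 recovers ordinary (stationary) training.
rng = np.random.default_rng(0)
theta = np.ones(4)
theta_init = np.zeros(4)
theta = soft_reset_step(theta, theta_init, drift=0.5, noise_scale=0.01, rng=rng)
```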