Poster in Workshop: Adaptive Foundation Models: Evolving AI for Personalized and Efficient Learning

Dream To Adapt: Learning Behaviors by Latent Imagination Under Non-Stationarity

Emiliyan Gospodinov · Vaisakh Shaj Kumar · Philipp Becker · Stefan Geyer · Gerhard Neumann


Abstract:

Developing foundational world models is a key research direction for embodied intelligence, and the ability to adapt to non-stationary environments is a crucial criterion for them. In this work, we introduce a new formalism, the Hidden Parameter-POMDP, designed for control with adaptive world models. We demonstrate that this approach enables learning robust behaviors across a variety of non-stationary RL benchmarks. Additionally, models trained under this formalism learn task abstractions in an unsupervised manner, yielding structured, task-aware latent spaces.
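The formalism itself is only named in the abstract, so as an illustration the following is a minimal, hypothetical sketch (not the authors' implementation) of the general idea behind a Hidden Parameter-POMDP: a POMDP whose transition dynamics depend on a hidden task parameter that varies across episodes, producing the non-stationarity the agent must adapt to. All names (`HiPPOMDP`, `rollout`, the toy point-mass environment) are assumptions made for this sketch.

```python
from dataclasses import dataclass
from typing import Any, Callable
import random

# Hypothetical sketch: a Hidden Parameter-POMDP couples a POMDP with a
# hidden task parameter theta. Theta is fixed within an episode but
# resampled across episodes, which is one way non-stationarity arises.
@dataclass
class HiPPOMDP:
    sample_theta: Callable[[], Any]              # task distribution over hidden parameters
    transition: Callable[[Any, Any, Any], Any]   # s' = T(s, a; theta), theta-dependent dynamics
    observe: Callable[[Any], Any]                # o = O(s); theta is never observed directly
    reward: Callable[[Any, Any], float]          # r = R(s, a)

def rollout(env: HiPPOMDP, policy: Callable[[Any], Any], s0: Any, horizon: int = 10):
    """Collect one episode. The agent only sees observations, so any
    effect of theta must be inferred from the trajectory itself."""
    theta = env.sample_theta()                   # hidden from the agent
    s, traj = s0, []
    for _ in range(horizon):
        o = env.observe(s)
        a = policy(o)
        r = env.reward(s, a)
        traj.append((o, a, r))
        s = env.transition(s, a, theta)
    return traj

# Toy instance: a 1-D point mass whose drift direction is the hidden parameter.
env = HiPPOMDP(
    sample_theta=lambda: random.choice([-1.0, 1.0]),
    transition=lambda s, a, th: s + a + th,
    observe=lambda s: s,
    reward=lambda s, a: -abs(s),
)
episode = rollout(env, policy=lambda o: 0.0, s0=0.0)
print(len(episode))  # one (o, a, r) tuple per step
```

A world model for this setting would, in addition to the usual latent state, infer a latent estimate of theta from observed transitions, which is what makes the learned latent space task-aware.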