

Poster
in
Workshop: Optimization for ML Workshop

Dimensionality Reduction Techniques for Global Bayesian Optimisation

Luo Long · Coralia Cartis · Paz Fink Shustin


Abstract:

Bayesian Optimisation (BO) is a state-of-the-art global optimisation technique for black-box problems where derivative information is unavailable and sample efficiency is crucial. However, improving the general scalability of BO has proved challenging. Here, we explore Latent Space Bayesian Optimisation (LSBO), which applies dimensionality reduction to perform BO in a reduced-dimensional subspace. While early LSBO methods used (linear) random projections (Wang et al., 2013 [26]), we employ Variational Autoencoders (VAEs) to manage more complex data structures and general DR tasks. Building on Grosnit et al. (2021) [11], we analyse the VAE-based LSBO framework, focusing on VAE retraining and deep metric loss. We suggest a few key corrections to their implementation, originally designed for tasks such as molecule generation, and reformulate the algorithm for broader optimisation purposes. Our numerical results show that structured latent manifolds improve BO performance. Additionally, we examine the use of the Matérn-5/2 kernel for Gaussian Processes in this LSBO context. We also integrate Sequential Domain Reduction (SDR), a standard global optimisation efficiency strategy, into BO. SDR is implemented in a GPU-based environment using BoTorch, both in the original and VAE-generated latent spaces, marking the first application of SDR within LSBO.
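The abstract's two core ingredients, a dimensionality-reducing map into a latent space and a Matérn-5/2 GP surrogate driving BO, can be illustrated with the simpler random-projection flavour of LSBO it cites (Wang et al., 2013). The following is a minimal NumPy sketch, not the authors' implementation: the toy objective, the fixed random linear embedding, and all names are illustrative assumptions, and the candidate set is random rather than a proper acquisition optimiser.

```python
import numpy as np
from math import erf, sqrt, pi

rng = np.random.default_rng(0)

def matern52(A, B, ls=1.0):
    """Matérn-5/2 kernel matrix between row-stacked points A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    r = np.sqrt(5.0 * d2) / ls
    return (1.0 + r + r ** 2 / 3.0) * np.exp(-r)

def gp_posterior(Xtr, ytr, Xte, ls=1.0, noise=1e-5):
    """Exact GP posterior mean/std with zero prior mean and jitter."""
    K = matern52(Xtr, Xtr, ls) + noise * np.eye(len(Xtr))
    L = np.linalg.cholesky(K)
    Ks = matern52(Xtr, Xte, ls)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, ytr))
    mu = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.clip(1.0 - (v * v).sum(0), 1e-12, None)  # k(x, x) = 1 here
    return mu, np.sqrt(var)

def expected_improvement(mu, sd, best):
    """EI acquisition for maximisation, via the standard normal cdf/pdf."""
    z = (mu - best) / sd
    Phi = 0.5 * (1.0 + np.vectorize(erf)(z / sqrt(2.0)))
    phi = np.exp(-0.5 * z ** 2) / sqrt(2.0 * pi)
    return (mu - best) * Phi + sd * phi

# Toy high-dimensional objective (maximise); only 2 coordinates matter.
D, d = 50, 2
x_star = np.zeros(D); x_star[:2] = 0.3
def f(x):
    return -np.sum((x[:2] - x_star[:2]) ** 2)

A = rng.normal(size=(D, d)) / np.sqrt(d)  # fixed random linear embedding
def lift(z):
    return np.clip(A @ z, -1.0, 1.0)  # latent point -> box [-1, 1]^D

# Initial latent design, then BO entirely in the d-dimensional space.
Z = rng.uniform(-1, 1, size=(5, d))
y = np.array([f(lift(z)) for z in Z])
for _ in range(15):
    cand = rng.uniform(-1, 1, size=(256, d))      # random candidate set
    mu, sd = gp_posterior(Z, y, cand)
    z_next = cand[np.argmax(expected_improvement(mu, sd, y.max()))]
    Z = np.vstack([Z, z_next])
    y = np.append(y, f(lift(z_next)))

print(y[:5].max(), y.max())  # best of initial design vs. best after BO
```

A VAE-based variant would replace `lift` with the VAE decoder (and periodically retrain the encoder/decoder, as the abstract discusses), while SDR would additionally shrink the latent box around the incumbent between iterations.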
