Poster in NeurIPS 2024 Workshop: Machine Learning and the Physical Sciences
Evolutionary and Transformer based methods for Symbolic Regression
Samyak Jha · Sergei Gleyzer · Eric Reinhardt · Victor Baules · Francois Charton · Nobuchika Okada
Abstract:
Symbolic regression aims to uncover mathematical expressions that fit data, traditionally using evolutionary algorithms such as genetic programming. However, these methods often struggle with noise, which reduces their robustness. We propose two hybrid approaches that integrate genetic programming with transformer models to improve performance. The first method, partially initialized genetic programming (PIGP), seeds the genetic programming population with solutions from a pretrained transformer. The second employs symbolic Direct Preference Optimization (DPO): a pretrained transformer generates candidate solutions via beam search, which are then refined by genetic programming, and preference pairs built from the top solutions are used to fine-tune the transformer to improve $R^2$ scores. Our experiments show that these transformer-based methods significantly enhance robustness in the presence of noise and perform comparably to or better than traditional genetic programming methods, such as $\epsilon$-lexicase selection, in noise-free conditions. These findings highlight the potential of transformer-enhanced symbolic regression for improved model robustness and accuracy.
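The abstract describes PIGP only at a high level, so a minimal sketch may help illustrate the idea of partial initialization. Everything here is an assumption rather than the authors' code: `transformer_beam_search`, `random_expression`, and the `seed_fraction` parameter are hypothetical stand-ins for a pretrained proposal model, a standard GP "grow" initializer, and the fraction of the population seeded from the transformer.

```python
import random

def transformer_beam_search(X, y, num_beams=10):
    # Hypothetical stand-in: in the paper, beam search over a pretrained
    # transformer proposes candidate expressions conditioned on the data.
    # Here we return fixed toy guesses instead of real model output.
    return ["x0 * x0", "x0 + 1.0", "2.0 * x0"][:num_beams]

def random_expression(depth=3):
    # Standard GP-style random "grow" initialization over a toy grammar
    # with one variable, constants, and three binary operators.
    if depth == 0 or random.random() < 0.3:
        return random.choice(["x0", f"{random.uniform(-2, 2):.2f}"])
    op = random.choice(["+", "-", "*"])
    return f"({random_expression(depth - 1)} {op} {random_expression(depth - 1)})"

def initial_population(X, y, pop_size=100, seed_fraction=0.2):
    # Partially initialized GP (PIGP): a fraction of the starting population
    # comes from transformer proposals; the rest is randomly initialized,
    # after which GP evolution proceeds as usual.
    seeds = transformer_beam_search(X, y)
    n_seed = min(int(pop_size * seed_fraction), len(seeds))
    return seeds[:n_seed] + [random_expression() for _ in range(pop_size - n_seed)]

population = initial_population(X=None, y=None)  # toy call; real use passes data
```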
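The symbolic-DPO loop can be sketched in the same spirit. The pairing scheme below (top-ranked candidates as "chosen", bottom-ranked as "rejected") is an assumption inferred from the abstract, and `gp_refine` and `evaluate_r2` are hypothetical placeholders for the GP refinement and $R^2$ scoring steps.

```python
import itertools

def gp_refine(expr, X, y):
    # Hypothetical placeholder: in the paper, genetic programming refines
    # each transformer-proposed candidate; here it is returned unchanged.
    return expr

def evaluate_r2(expr, X, y):
    # Hypothetical placeholder: a real version would evaluate the candidate
    # expression on (X, y) and return its R^2 score.
    return 0.0

def build_preference_pairs(candidates, X, y, top_k=4):
    # Refine beam-search candidates with GP, rank them by R^2, then pair
    # high-scoring (chosen) with low-scoring (rejected) expressions.
    refined = [gp_refine(c, X, y) for c in candidates]
    ranked = sorted(refined, key=lambda e: evaluate_r2(e, X, y), reverse=True)
    chosen, rejected = ranked[:top_k], ranked[-top_k:]
    return [(c, r) for c, r in itertools.product(chosen, rejected) if c != r]
```

The resulting $(y_w, y_l)$ pairs would then plug into the standard DPO objective (Rafailov et al., 2023), $\mathcal{L}_{\mathrm{DPO}} = -\mathbb{E}\big[\log \sigma\big(\beta \log \tfrac{\pi_\theta(y_w \mid x)}{\pi_{\mathrm{ref}}(y_w \mid x)} - \beta \log \tfrac{\pi_\theta(y_l \mid x)}{\pi_{\mathrm{ref}}(y_l \mid x)}\big)\big]$, though the exact loss variant used in the paper is not specified in the abstract.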