Poster in NeurIPS 2024 Workshop: Machine Learning and the Physical Sciences

S-KANformer: Enhancing Transformers for Symbolic Calculations in High Energy Physics

Ritesh Bhalerao · Eric Reinhardt · Sergei Gleyzer · Nobuchika Okada · Victor Baules


Abstract:

Cross-section computations are central to high-energy physics and are among the most time-consuming steps in its calculations. Previous work has tackled this problem with symbolic machine learning, particularly vanilla transformer models. In this paper, we further explore the S-KANformer, a transformer model infused with SineKAN layers. We present empirical evidence that our model significantly outperforms the vanilla transformer on most tasks and is more robust to varying factors such as batch size, dataset size, and sequence length. We also discuss limitations and potential future directions of this work. Although more comprehensive studies remain to be undertaken, this work points to promising applications of the S-KANformer, especially in domains involving symbolic calculations.
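The abstract does not specify the S-KANformer's implementation. As a rough illustration, a minimal sketch of a SineKAN-style layer is shown below, assuming the common formulation in which each output unit sums learnable sine basis functions of every input feature; the function and parameter names here are hypothetical, not from the paper.

```python
import numpy as np

def sinekan_layer(x, amplitudes, frequencies, phases):
    """Illustrative SineKAN-style layer (hypothetical parameter layout).

    Each output unit j combines sine basis functions of every input
    feature i:  y_j = sum_{i,k} A[j, i, k] * sin(w[k] * x_i + p[k]),
    where the amplitudes A (and, in a full model, w and p) are learnable.
    """
    # x: (in_dim,); amplitudes: (out_dim, in_dim, n_basis)
    # frequencies, phases: (n_basis,)
    basis = np.sin(frequencies * x[:, None] + phases)  # (in_dim, n_basis)
    return np.einsum("oik,ik->o", amplitudes, basis)   # (out_dim,)

# Tiny forward pass with random parameters (assumed shapes, for illustration)
rng = np.random.default_rng(0)
in_dim, out_dim, n_basis = 4, 3, 8
x = rng.normal(size=in_dim)
A = rng.normal(size=(out_dim, in_dim, n_basis)) / n_basis
w = np.arange(1, n_basis + 1, dtype=float)  # increasing base frequencies
p = rng.normal(size=n_basis)
y = sinekan_layer(x, A, w, p)
```

In a transformer, such a layer would plausibly replace the MLP sublayers in each block, which is where KAN-infused transformer variants typically substitute learnable-activation layers.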