Poster in NeurIPS 2024 Workshop: Machine Learning and the Physical Sciences
Learning Symmetry-Independent Jet Representations via Jet-Based Joint Embedding Predictive Architecture
Subash Katel · Haoyang Li · Zihan Zhao · Javier Duarte
In high energy physics, self-supervised learning methods have the potential to aid in the creation of machine learning models without the need for labeled datasets for a variety of tasks, including those related to jets---narrow sprays of particles produced by quarks and gluons in high energy particle collisions. This study introduces an approach to learning augmentation-independent jet representations using a Jet-based Joint Embedding Predictive Architecture (J-JEPA). The approach predicts various physical targets from an informative context, using the target positions as joint information. As an augmentation-free method, J-JEPA avoids introducing biases that could harm downstream tasks, which often require invariance under augmentations different from those used in pretraining. This augmentation-independent training enables versatile applications, offering a pathway toward a cross-task foundation model. We fine-tune the representations learned by J-JEPA for jet tagging and benchmark them against task-specific representations.
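To make the predictive setup concrete, the sketch below shows one JEPA-style step in NumPy under loud assumptions: the linear "encoders", mean pooling, dimensions, and variable names (`W_ctx`, `W_tgt`, `W_pred`, `tgt_pos`) are all illustrative stand-ins, not the paper's actual transformer-based implementation. It only illustrates the data flow: a context encoder embeds the visible particles, a predictor maps that embedding plus the target's position to a predicted target embedding, the loss is computed in embedding space against a separate target encoder, and the target encoder tracks the context encoder via an exponential moving average.

```python
import numpy as np

rng = np.random.default_rng(0)

D_IN, D_EMB = 4, 8  # per-particle features, embedding size (illustrative)

# Linear "encoders" standing in for the transformer encoders (assumption).
W_ctx = rng.normal(size=(D_EMB, D_IN)) * 0.1        # context encoder weights
W_tgt = W_ctx.copy()                                # target encoder starts as a copy
W_pred = rng.normal(size=(D_EMB, D_EMB + 2)) * 0.1  # predictor: embedding + 2-D position

def encode(W, particles):
    """Mean-pool a linear per-particle embedding (stand-in for attention pooling)."""
    return (W @ particles.T).mean(axis=1)

def predict(ctx_emb, tgt_pos):
    """Predict the target embedding from the context embedding and target position."""
    return W_pred @ np.concatenate([ctx_emb, tgt_pos])

def ema_update(W_tgt, W_ctx, tau=0.99):
    """Target encoder follows the context encoder via an exponential moving average."""
    return tau * W_tgt + (1.0 - tau) * W_ctx

# One "jet": context particles and a held-out target with a known position.
ctx_particles = rng.normal(size=(10, D_IN))
tgt_particles = rng.normal(size=(3, D_IN))
tgt_pos = np.array([0.5, -1.2])  # e.g. (eta, phi) of the target region (assumption)

pred = predict(encode(W_ctx, ctx_particles), tgt_pos)
target = encode(W_tgt, tgt_particles)   # treated as a fixed target (no gradient)
loss = np.mean((pred - target) ** 2)    # prediction loss in embedding space
W_tgt = ema_update(W_tgt, W_ctx)
```

The key property this illustrates is that the loss is defined purely between embeddings, so no hand-crafted augmentations of the input jet are needed during pretraining.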