

Poster
in
Workshop: Table Representation Learning Workshop (TRL)

AGATa: Attention-Guided Augmentation for Tabular Data in Contrastive Learning

Moonjung Eo · Kyungeun Lee · Min-Kook Suh · Hyeseung Cho · Ye Seul Sim · Woohyung Lim

Keywords: [ Contrastive learning ] [ Input Augmentation ] [ Tabular domain ]


Abstract:

Contrastive learning has demonstrated significant potential across various domains, including recent applications to tabular data. However, adapting this approach to tabular structures presents distinct challenges, particularly in developing effective augmentation techniques. While existing methods have shown promise, there remains room for improvement in preserving critical feature relationships during the augmentation process. In this paper, we explore an alternative approach that utilizes attention scores to guide augmentation, aiming to introduce meaningful variations while maintaining important feature interactions. This method builds upon existing work in the field, offering a complementary perspective on tabular data augmentation for contrastive learning.

Our approach explores two main aspects: 1) Attention-guided Feature Selection, which focuses augmentations on features with lower attention scores, and 2) Dynamic Augmentation Strategy, which alternates between different augmentation techniques during training. This combination aims to maintain key data characteristics while introducing diverse variations.

Experimental results suggest that our method performs competitively with existing augmentation techniques in preserving tabular data structure and enhancing downstream task performance.
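The two components described above can be sketched in a few lines. The following is a minimal illustration, not the authors' implementation: all function names, the corruption fraction, and the choice of column-shuffle and Gaussian-noise augmentations are assumptions made for the example. The key idea it demonstrates is restricting corruption to the features with the lowest attention scores, and alternating the augmentation type across training steps.

```python
import numpy as np

def attention_guided_augment(x, attn_scores, frac=0.3, mode="shuffle", rng=None):
    """Corrupt only the features with the LOWEST attention scores.

    x: (batch, n_features) array of samples.
    attn_scores: (n_features,) per-feature attention/importance scores.
    frac, mode: illustrative parameters, not from the paper.
    """
    rng = np.random.default_rng() if rng is None else rng
    n_feat = x.shape[1]
    k = max(1, int(frac * n_feat))
    # Low-attention features are treated as least critical -> safe to perturb,
    # leaving high-attention feature interactions intact.
    low_attn = np.argsort(attn_scores)[:k]
    x_aug = x.copy()
    for j in low_attn:
        if mode == "shuffle":
            # Permute the column across the batch (a common tabular corruption).
            x_aug[:, j] = rng.permutation(x_aug[:, j])
        else:
            # Add Gaussian noise scaled by the column's standard deviation.
            x_aug[:, j] += rng.normal(0.0, x[:, j].std() + 1e-8, size=x.shape[0])
    return x_aug

def mode_for_step(step):
    """Toy dynamic strategy: alternate augmentation types across training steps."""
    return "shuffle" if step % 2 == 0 else "noise"
```

A training loop would then build the two contrastive views with, e.g., `attention_guided_augment(batch, attn, mode=mode_for_step(step))`, so that high-attention features stay untouched in both views.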
