

Poster in Workshop: 5th Workshop on Self-Supervised Learning: Theory and Practice

Self-Supervised Pretext Tasks for Event Sequence Data from Detecting Misalignment

Yimu Wang · He Zhao · Ruizhi Deng · Fred Tung · Greg Mori


Abstract:

Pretext training followed by task-specific fine-tuning has been a successful approach in vision and language domains. This paper proposes a self-supervised pretext training framework tailored to event sequence data. We introduce novel auxiliary tasks (pretext tasks) that encourage the network to learn the coupling relationships between event times and types -- a previously untapped source of self-supervision without labels. These pretext tasks unlock foundational representations that are generalizable across different downstream tasks, including next-event prediction for temporal point process models, event sequence classification, and missing event interpolation. Experiments on popular public benchmarks demonstrate the potential of the proposed method across different tasks and data domains.
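The abstract describes pretext tasks built around the coupling between event times and event types. Below is a minimal, hypothetical sketch of one way such a misalignment-detection pretext task could look: corrupted sequences are created by permuting event types so they no longer match their timestamps, and a small encoder is trained to distinguish aligned from misaligned sequences. The encoder architecture, corruption scheme, and all names here are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn


def make_misaligned(types: torch.Tensor) -> torch.Tensor:
    """Permute event types along the sequence dimension, breaking the
    coupling between event times and event types (illustrative corruption)."""
    perm = torch.randperm(types.shape[-1])
    return types[..., perm]


class MisalignmentDetector(nn.Module):
    """Toy encoder + binary head: predicts whether times and types are aligned."""

    def __init__(self, num_types: int, d_model: int = 32):
        super().__init__()
        self.type_emb = nn.Embedding(num_types, d_model)
        self.time_proj = nn.Linear(1, d_model)
        self.encoder = nn.GRU(d_model, d_model, batch_first=True)
        self.head = nn.Linear(d_model, 1)

    def forward(self, times: torch.Tensor, types: torch.Tensor) -> torch.Tensor:
        # Combine time and type information per event, encode the sequence,
        # and emit one alignment logit per sequence.
        x = self.type_emb(types) + self.time_proj(times.unsqueeze(-1))
        _, h = self.encoder(x)                       # h: (1, batch, d_model)
        return self.head(h.squeeze(0)).squeeze(-1)   # (batch,) logits


# Usage: one pretext-training step on synthetic event sequences.
batch, seq_len, num_types = 8, 20, 5
times = torch.sort(torch.rand(batch, seq_len), dim=-1).values   # increasing timestamps
types = torch.randint(0, num_types, (batch, seq_len))

model = MisalignmentDetector(num_types)
optim = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

# Aligned (label 1) and misaligned (label 0) examples share the same timestamps.
mis_types = make_misaligned(types)
logits = torch.cat([model(times, types), model(times, mis_types)])
labels = torch.cat([torch.ones(batch), torch.zeros(batch)])

loss = loss_fn(logits, labels)
loss.backward()
optim.step()
```

After pretext training of this kind, the encoder would be fine-tuned on downstream tasks such as next-event prediction, sequence classification, or missing-event interpolation, as outlined in the abstract.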
