Poster in Workshop: Time Series in the Age of Large Models

Preventing Conflicting Gradients in Neural Temporal Point Process Models for Irregular Time Series Data

Tanguy Bosser · Souhaib Ben Taieb


Abstract:

Neural Marked Temporal Point Processes (MTPP) are flexible models that are typically trained on large collections of sequences of irregularly spaced, labeled events. These models inherently learn two predictive distributions: one for the arrival times of events and another for the event types, also known as marks. In this study, we demonstrate that learning an MTPP model can be framed as a two-task learning problem, where both tasks share a common set of trainable parameters that are optimized jointly. We show that this joint optimization can lead to conflicting gradients during training, degrading performance on both tasks. To overcome this issue, we introduce novel parametrizations for neural MTPP models that allow each task to be modeled and trained separately, effectively avoiding the problem of conflicting gradients.
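The notion of conflicting gradients described in the abstract can be sketched concretely: two task gradients over shared parameters are said to conflict when their cosine similarity is negative, so a step that decreases one loss increases the other. The sketch below is illustrative only, assuming flattened gradient vectors for the arrival-time and mark losses; the function names and toy values are not from the paper.

```python
# Hedged sketch: detecting conflicting gradients between the arrival-time
# and mark losses of an MTPP model over shared parameters. All names and
# values here are illustrative assumptions, not the authors' implementation.
import numpy as np

def cosine_similarity(g1, g2):
    """Cosine similarity between two flattened gradient vectors."""
    return float(np.dot(g1, g2) / (np.linalg.norm(g1) * np.linalg.norm(g2)))

def gradients_conflict(g_time, g_mark):
    """True when the shared-parameter gradients point in opposing directions
    (negative cosine similarity), the usual criterion for gradient conflict."""
    return cosine_similarity(g_time, g_mark) < 0.0

# Toy shared-parameter gradients for the two tasks.
g_time = np.array([1.0, -0.5, 0.2])   # gradient of the arrival-time loss
g_mark = np.array([-1.0, 0.6, 0.1])   # gradient of the mark loss
print(gradients_conflict(g_time, g_mark))  # True: the gradients conflict
```

Separate task-specific parametrizations, as proposed in the paper, sidestep this check entirely: when the two losses no longer share trainable parameters, their gradients cannot pull a shared parameter in opposing directions.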