Poster in Workshop: 5th Workshop on Self-Supervised Learning: Theory and Practice
Two Is Better Than One: Aligned Clusters Improve Anomaly Detection
Alain Ryser · Thomas Sutter · Alexander Marx · Julia Vogt
Abstract:
Anomaly detection focuses on identifying samples that deviate from the norm. When working with high-dimensional data such as images, a crucial requirement for detecting anomalous patterns is learning lower-dimensional representations that capture concepts of normality. Recent advances in self-supervised learning have shown great promise in this regard. However, many successful self-supervised anomaly detection methods assume prior knowledge about anomalies to create synthetic outliers during training. Yet, in real-world applications, we often do not know what to expect from unseen data, and we can solely leverage knowledge about normal data. In this work, we propose Con$_2$, which learns representations through context augmentations that model invariances of normal data while letting us observe samples from two distinct perspectives. At test time, representations of anomalies that do not adhere to these invariances deviate from the representation structure learned during training, allowing us to detect anomalies without relying on prior knowledge about them.
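The abstract does not spell out the exact scoring rule. As one hypothetical illustration of the general idea of flagging samples whose representations deviate from the structure of normal data, a nearest-centroid anomaly score over clusters of normal embeddings could look like the following (the clustering setup and score are assumptions for illustration, not the paper's method):

```python
import numpy as np

def anomaly_score(z, centroids):
    """Score a test embedding z by its distance to the nearest
    cluster centroid estimated from normal training embeddings.
    Higher scores indicate stronger deviation from normality."""
    dists = np.linalg.norm(centroids - z, axis=1)
    return float(dists.min())

# Toy example: two clusters of "normal" 2-D embeddings.
rng = np.random.default_rng(0)
normal = np.concatenate([
    rng.normal(loc=0.0, scale=0.1, size=(50, 2)),
    rng.normal(loc=5.0, scale=0.1, size=(50, 2)),
])
labels = (np.arange(100) >= 50).astype(int)  # known cluster assignment
centroids = np.stack([normal[labels == k].mean(axis=0) for k in (0, 1)])

in_dist = anomaly_score(np.array([0.05, -0.02]), centroids)  # near a normal cluster
out_dist = anomaly_score(np.array([2.5, 2.5]), centroids)    # far from both clusters
```

A sample lying near one of the learned clusters receives a low score, while a sample falling outside the normal structure receives a high one.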