

Poster

Persistence Homology Distillation for Semi-supervised Continual Learning

Yan Fan · Yu Wang · Pengfei Zhu · Dongyue Chen · Qinghua Hu

Thu 12 Dec 4:30 p.m. PST — 7:30 p.m. PST

Abstract:

Semi-supervised continual learning (SSCL) has attracted significant attention for addressing catastrophic forgetting on semi-supervised data. Knowledge distillation, which leverages data representations and pair-wise similarities, has shown significant potential for preserving information in SSCL. However, traditional distillation strategies often fail on unlabeled data with inaccurate or noisy information, limiting their effectiveness in feature spaces that undergo substantial change during continual learning. To address these limitations, we propose Persistence Homology Distillation (PsHD) to preserve intrinsic structural information that is insensitive to noise in semi-supervised continual learning. First, we capture structural features with persistence homology by tracking homological evolution across different scales in vision data, where the multi-scale characteristic establishes stability under noise interference. Next, we propose a persistence homology distillation loss for SSCL and design an efficient acceleration algorithm to reduce the computational cost of persistence homology in our module. Furthermore, we demonstrate the superior stability of PsHD compared to sample-representation and pair-wise-similarity distillation methods. Finally, experimental results on three widely used datasets validate that PsHD outperforms all baselines, with improvements of up to 3.2%, and highlight the potential of utilizing unlabeled data in SSCL by reducing the memory buffer size by 50%.
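The abstract describes distilling multi-scale topological structure rather than raw representations or pairwise similarities. The sketch below is not the authors' released code; it illustrates the general idea for 0-dimensional persistence only, under the assumption that the 0-dimensional death times of a Vietoris-Rips filtration coincide with the edge lengths of a minimum spanning tree of the pairwise-distance graph, so gradients can flow through the selected distances of the current model. Function names (`h0_death_times`, `pshd_h0_loss`) are hypothetical.

```python
# Minimal sketch of a 0-dimensional persistent-homology distillation term.
# Assumption: H0 death times of the Rips filtration equal MST edge lengths,
# so the MST edge *indices* can be chosen on detached distances while the
# edge *lengths* are gathered from the differentiable distance matrix.

import torch
from scipy.sparse.csgraph import minimum_spanning_tree


def h0_death_times(feats: torch.Tensor) -> torch.Tensor:
    """Sorted H0 death times of the Rips filtration on an (n, d) point cloud."""
    dist = torch.cdist(feats, feats)                       # pairwise distances (n, n)
    mst = minimum_spanning_tree(dist.detach().cpu().numpy()).tocoo()
    rows = torch.as_tensor(mst.row, dtype=torch.long, device=feats.device)
    cols = torch.as_tensor(mst.col, dtype=torch.long, device=feats.device)
    deaths = dist[rows, cols]                              # MST edge lengths = H0 deaths
    return torch.sort(deaths).values


def pshd_h0_loss(old_feats: torch.Tensor, new_feats: torch.Tensor) -> torch.Tensor:
    """Match the sorted H0 death times of the old and new feature spaces."""
    with torch.no_grad():                                  # old-model diagram is a fixed target
        target = h0_death_times(old_feats)
    return torch.nn.functional.mse_loss(h0_death_times(new_feats), target)


if __name__ == "__main__":
    torch.manual_seed(0)
    old = torch.randn(32, 128)                             # stand-in for frozen-model features
    new = (old + 0.05 * torch.randn_like(old)).requires_grad_()
    loss = pshd_h0_loss(old, new)
    loss.backward()
    print(f"H0 distillation loss: {loss.item():.4f}")
```

A full implementation in the spirit of the paper would also track higher-dimensional homology and use the acceleration algorithm the abstract mentions; this sketch only conveys why a diagram-matching term is less sensitive to per-sample noise than representation or pairwise-similarity distillation.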
