

Poster in the Workshop on Machine Learning and Compression

Deep Clustering with Associative Memories

Bishwajit Saha · Dmitry Krotov · Mohammed Zaki · Parikshit Ram


Abstract: Deep clustering -- joint representation learning and latent-space clustering -- is a well-studied problem, especially in computer vision and text processing under the deep learning framework. While representation learning is generally differentiable, clustering is an inherently discrete optimization, requiring various approximations and regularizations to fit into a standard differentiable pipeline. This leads to a somewhat disjointed treatment of representation learning and clustering. Recently, Associative Memories were utilized in the end-to-end differentiable $\texttt{ClAM}$ clustering scheme (Saha et al. 2023). In this work, we show how Associative Memories enable a novel take on deep clustering, $\texttt{DClAM}$, simplifying the whole pipeline and tying representation learning and clustering together more intricately. Our experiments showcase the advantage of $\texttt{DClAM}$, producing improved clustering quality regardless of the architecture choice (convolutional, residual, or fully-connected) or data modality (images or text).
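To make the idea of end-to-end differentiable clustering with Associative Memories more concrete, below is a minimal sketch (not the authors' implementation) of one plausible setup: learnable prototypes act as memories, latent codes are iteratively attracted toward them via softmax-weighted updates in the style of dense associative memory dynamics, and the attracted codes feed a decoder so a single reconstruction loss trains representation and clustering jointly. All class names, layer sizes, and hyperparameters here are illustrative assumptions, not taken from the paper.

```python
import torch
import torch.nn as nn

class AssociativeMemoryClustering(nn.Module):
    """Toy sketch: k learnable prototypes act as memories; inputs are
    iteratively pulled toward them via softmax-weighted updates, keeping
    the whole clustering step differentiable. Illustrative only."""

    def __init__(self, dim, n_clusters, beta=10.0, n_steps=5):
        super().__init__()
        self.prototypes = nn.Parameter(torch.randn(n_clusters, dim))
        self.beta = beta        # inverse temperature of the attractor dynamics
        self.n_steps = n_steps  # number of recurrent attraction steps

    def forward(self, z):
        # z: (batch, dim) latent codes
        for _ in range(self.n_steps):
            logits = self.beta * z @ self.prototypes.t()   # similarity to each prototype
            weights = torch.softmax(logits, dim=-1)        # soft cluster assignment
            z = weights @ self.prototypes                  # move toward weighted prototypes
        return z, weights


# Usage sketch: cluster in the latent space of a small autoencoder.
encoder = nn.Sequential(nn.Linear(784, 128), nn.ReLU(), nn.Linear(128, 32))
decoder = nn.Sequential(nn.Linear(32, 128), nn.ReLU(), nn.Linear(128, 784))
am = AssociativeMemoryClustering(dim=32, n_clusters=10)

x = torch.rand(64, 784)                        # fake batch of flattened images
z = encoder(x)
z_attracted, soft_assign = am(z)
x_hat = decoder(z_attracted)
loss = nn.functional.mse_loss(x_hat, x)        # single joint objective
loss.backward()                                # gradients flow through the clustering step
```

Because the attraction steps are ordinary tensor operations, gradients reach both the encoder/decoder weights and the prototypes, which is the sense in which representation learning and clustering are tied together rather than handled by separate, disjointed objectives.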
