

Poster

Differentiable Unsupervised Feature Selection based on a Gated Laplacian

Ofir Lindenbaum · Uri Shaham · Erez Peterfreund · Jonathan Svirsky · Nicolas Casey · Yuval Kluger

Virtual

Keywords: [ Graph Learning ] [ Clustering ]


Abstract:

Scientific observations may consist of a large number of variables (features). Selecting a subset of meaningful features is often crucial for identifying patterns hidden in the ambient space. In this paper, we present a method for unsupervised feature selection, and we demonstrate its advantage in clustering, a common unsupervised task. We propose a differentiable loss that combines a graph Laplacian-based score, which favors low-frequency features, with a gating mechanism for removing nuisance features. Our method improves upon the naive graph Laplacian score by replacing it with a gated variant computed on a subset of low-frequency features. We identify this subset by learning the parameters of continuously relaxed Bernoulli variables, which gate the entire feature space. We mathematically motivate the proposed approach and demonstrate that, in the high-noise regime, it is crucial to compute the graph Laplacian on the gated inputs rather than on the full feature set. Using several real-world examples, we demonstrate the efficacy and advantage of the proposed approach over leading baselines.
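The abstract names two ingredients: relaxed-Bernoulli gates over the features and a Laplacian-based score computed on the gated inputs. The snippet below is a minimal PyTorch sketch of one way these pieces can fit together; the function and variable names (gated_laplacian_loss, gate_logits), the Gaussian-kernel affinity, the unnormalized Laplacian, and the sparsity regularizer weight are illustrative assumptions, not the authors' exact objective or code.

```python
import torch
from torch.distributions import RelaxedBernoulli


def gated_laplacian_loss(X, gate_logits, temperature=0.1, sigma=1.0, reg=1.0):
    """Sketch of a differentiable gated-Laplacian-style loss on X of shape (n, d).

    Soft gates are sampled from a continuously relaxed Bernoulli so that
    gradients reach the gate parameters; the graph Laplacian is built on the
    gated inputs rather than the full feature set.
    """
    # One soft gate per feature, reparameterized sample in (0, 1).
    gates = RelaxedBernoulli(torch.tensor(temperature), logits=gate_logits).rsample()
    Xg = X * gates                                    # gated inputs, shape (n, d)

    # Gaussian-kernel affinity and unnormalized graph Laplacian on the gated data.
    sq_norms = (Xg ** 2).sum(dim=1, keepdim=True)
    sq_dists = (sq_norms + sq_norms.T - 2.0 * Xg @ Xg.T).clamp_min(0.0)
    W = torch.exp(-sq_dists / (2.0 * sigma ** 2))
    D = torch.diag(W.sum(dim=1))
    L = D - W

    # Laplacian-score-style term per gated feature: f^T L f / f^T D f
    # (small values correspond to low-frequency, graph-smooth features).
    num = torch.einsum('nd,nm,md->d', Xg, L, Xg)
    den = torch.einsum('nd,nm,md->d', Xg, D, Xg) + 1e-8
    smoothness = (num / den).sum()

    # Illustrative sparsity regularizer on the expected number of open gates.
    open_prob = torch.sigmoid(gate_logits)
    return smoothness + reg * open_prob.sum()


# Minimal usage: gradients flow from the loss back to the gate parameters.
X = torch.randn(128, 50)                              # toy data: 128 samples, 50 features
gate_logits = torch.zeros(50, requires_grad=True)
loss = gated_laplacian_loss(X, gate_logits)
loss.backward()
print(gate_logits.grad.shape)                         # torch.Size([50])
```

Because the affinity matrix W and the Laplacian L are rebuilt from the gated matrix Xg inside the loss, gradients flow through the graph itself back to the gate parameters, which is the distinction the abstract draws between the gated variant and the naive Laplacian score computed on the full feature set.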
