Poster
in
Workshop: Symmetry and Geometry in Neural Representations

A New Geometric Approach of Adaptive Neighborhood Selection for Classification

Alexandre Levada · Frank Nielsen · Michel Ferreira Cardia Haddad

Keywords: [ Supervised classification ] [ Shape operator ] [ $k$-nearest neighbors ] [ Curvature ]


Abstract: The $k$-nearest neighbor ($k$-NN) rule is a widely adopted technique for nonparametric classification. However, specifying the number of neighbors, $k$, is often challenging, since many desirable properties of a classifier - including robustness to noise, smoothness of decision boundaries, the bias-variance tradeoff, and the handling of class imbalance - depend directly on this parameter. In the present work, we describe an adaptive $k$-nearest-neighbors method that defines the neighborhood size locally by examining the curvature of the sample. The rationale is that points with high curvature should have smaller neighborhoods (locally, the tangent space is a loose approximation of the data), whereas points with low curvature may have larger neighborhoods (locally, the tangent space approximates the underlying data shape well). Results on several real-world data sets indicate that the new method outperforms the well-established $k$-NN approach.
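The idea of shrinking the neighborhood where curvature is high can be sketched as follows. This is not the authors' method (which is based on the shape operator); instead, as an illustrative stand-in, it uses a simple PCA-based proxy for curvature - the fraction of local variance falling outside the top-$d$ principal directions of each point's neighborhood - and a linear mapping of that proxy onto a per-point $k$ in $[k_{\min}, k_{\max}]$. All function names and parameter choices here are hypothetical.

```python
import numpy as np

def local_curvature_proxy(X, n_nb=10, d=1):
    """Per-point curvature proxy: the PCA residual-variance ratio in each
    point's n_nb-neighborhood. A large residual means the d-dimensional
    tangent space fits the local data poorly (high curvature)."""
    n = len(X)
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)  # pairwise distances
    curv = np.empty(n)
    for i in range(n):
        nb = np.argsort(D[i])[1:n_nb + 1]            # nearest neighbors, excluding self
        P = X[nb] - X[nb].mean(axis=0)               # center the neighborhood
        s = np.linalg.svd(P, compute_uv=False) ** 2  # variance along each principal axis
        curv[i] = s[d:].sum() / s.sum()              # variance off the tangent plane
    return curv

def adaptive_k(curv, k_min=3, k_max=15):
    """Map curvature to neighborhood size: low curvature -> large k,
    high curvature -> small k (hypothetical linear mapping)."""
    t = (curv - curv.min()) / (curv.max() - curv.min() + 1e-12)
    return np.round(k_max - t * (k_max - k_min)).astype(int)

def predict(X, y, ks, x_query):
    """Classify x_query by majority vote, using the adaptive k assigned
    to the query's nearest training point."""
    order = np.argsort(np.linalg.norm(X - x_query, axis=1))
    votes = y[order[:ks[order[0]]]]
    vals, counts = np.unique(votes, return_counts=True)
    return vals[np.argmax(counts)]
```

On two well-separated clusters this behaves like ordinary $k$-NN; the adaptive $k$ only matters near curved regions of the class boundary, where a smaller neighborhood keeps distant points from dominating the vote.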