Poster in Workshop: Symmetry and Geometry in Neural Representations

Knowledge Distillation for Teaching Symmetry Invariances

Patrick Odagiu · Nicole Nobili · Fabian Dionys Schrag · Yves Bicker · Yuhui Ding

Keywords: [ Geometric Learning ] [ Knowledge Distillation ]


Abstract:

Knowledge distillation is used in an attempt to transfer model invariances under specific symmetry transformations of the data. To this end, a model that is invariant at the architectural level is distilled into a simpler model that is not. The efficacy of knowledge distillation in transferring these invariances is evaluated empirically on four such pairs of networks, each targeting a different data invariance. Six metrics are reported; they measure both how much knowledge distillation aids the learning process in general and how well it instills the targeted invariance specifically. It is observed that knowledge distillation fails to transfer invariances in all of the considered model pairs. Moreover, data augmentation proves more effective at instilling invariances into a network.
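
To make the setup concrete, below is a minimal, hypothetical sketch of the standard soft-target distillation objective (Hinton et al., 2015) that experiments like those in the abstract typically build on. None of this code is from the paper: the model definitions, temperature, and loss weighting are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Soft-target distillation loss: cross-entropy on hard labels plus a
    KL term pulling the student's temperature-softened outputs toward the
    teacher's. T and alpha are assumed hyperparameters, not the paper's."""
    hard = F.cross_entropy(student_logits, labels)
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)  # rescale so gradients match the hard-label term
    return alpha * hard + (1.0 - alpha) * soft

# Hypothetical teacher/student pair: in the paper's setting the teacher is
# invariant by construction (e.g. a rotation-equivariant network) and the
# student is a plain, non-invariant model. Linear stand-ins are used here
# only to keep the sketch runnable.
teacher = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 10))
student = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 10))

x = torch.randn(32, 1, 28, 28)
y = torch.randint(0, 10, (32,))
with torch.no_grad():          # teacher is frozen during distillation
    t_logits = teacher(x)
loss = distillation_loss(student(x), t_logits, y)
loss.backward()
```

The data-augmentation baseline the abstract compares against would instead drop the KL term and train the student directly on symmetry-transformed copies of the inputs (e.g. random rotations for a rotation invariance).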
