Using noise to probe recurrent neural network structure and prune synapses
Eli Moore, Rishidev Chaudhuri
Spotlight presentation: Orals & Spotlights Track 02: COVID/Health/Bio Applications
on 2020-12-07T19:10:00-08:00 - 2020-12-07T19:20:00-08:00
Abstract: Many networks in the brain are sparsely connected, and the brain eliminates synapses during development and learning. How could the brain decide which synapses to prune? In a recurrent network, determining the importance of a synapse between two neurons is a difficult computational problem, depending on the role that both neurons play and on all possible pathways of information flow between them. Noise is ubiquitous in neural systems, and is often considered an irritant to be overcome. Here we suggest that noise could play a functional role in synaptic pruning, allowing the brain to probe network structure and determine which synapses are redundant. We construct a simple, local, unsupervised plasticity rule that either strengthens or prunes synapses using only synaptic weight and the noise-driven covariance of the neighboring neurons. For a subset of linear and rectified-linear networks, we prove that this rule preserves the spectrum of the original matrix and hence preserves network dynamics even when the fraction of pruned synapses asymptotically approaches 1. The plasticity rule is biologically plausible and may suggest a new role for noise in neural computation.
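The core idea of the abstract (driving a recurrent network with noise, reading out the activity covariance of each synapse's pre- and postsynaptic neurons, and using that local signal to decide which synapses to prune) can be illustrated with a toy simulation. This is a minimal sketch under stated assumptions, not the paper's actual plasticity rule: the specific pruning score used below (weight magnitude relative to pairwise covariance) and all constants are hypothetical placeholders chosen only to show how a local, covariance-based criterion could be computed.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50  # number of neurons (arbitrary)

# Random recurrent weight matrix, rescaled so the linear dynamics are stable.
W = rng.normal(scale=1.0 / np.sqrt(n), size=(n, n))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))

# Drive the network with white noise: x_{t+1} = W x_t + xi_t.
T = 20000
X = np.zeros((T, n))
x = np.zeros(n)
for t in range(T):
    x = W @ x + rng.normal(scale=0.1, size=n)
    X[t] = x

# Noise-driven covariance of neuron activity. Crucially, each synapse only
# needs the covariance of its own pre- and postsynaptic neurons, so this
# signal is locally available.
C = np.cov(X[T // 2:].T)  # discard the initial transient

# Hypothetical local score (NOT the rule proved in the paper): treat a
# synapse from j to i as dispensable when its weight is small relative to
# how strongly neurons i and j already covary via the rest of the network.
score = np.abs(W) / (np.abs(C) + 1e-12)
mask = score > np.quantile(score, 0.5)  # keep the top half of synapses
W_pruned = W * mask

print("fraction pruned:", 1.0 - mask.mean())
```

The point of the sketch is structural: the decision for each synapse uses only its own weight and one entry of the activity covariance, i.e., quantities observable at that synapse, with no global knowledge of the network.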