Poster in Workshop: NeuroAI: Fusing Neuroscience and AI for Intelligent Solutions

Dyadic Learning in Recurrent and Feedforward Models

Rasmus Høier · Kirill Kalinin · Maxence Ernoult · Christopher Zach


Abstract:

From electrical to biological circuits, feedback plays a critical role in amplifying, dampening, and stabilizing signals. In local, activity-difference-based alternatives to backpropagation, feedback connections are used to propagate learning signals in deep neural networks. We propose a saddle-point-based framework using dyadic (two-state) neurons for training a family of parameterized models, which includes the symmetric Hopfield model, pure feedforward networks, and a less explored skew-symmetric Hopfield variant. The resulting learning method reduces to equilibrium propagation (EP) for symmetric Hopfield models and to dual propagation (DP) for feedforward networks, while the skew-symmetric Hopfield setting yields a new method with desirable robustness properties. Experimentally, we demonstrate that the new skew-symmetric Hopfield model performs on par with EP and DP in terms of predictive performance, while exhibiting enhanced robustness to input changes and strong feedback, and being less prone to neural saturation. We identify the fundamentally different types of feedback signals propagated in each model as the main cause of the differences in robustness and saturation.
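To make the dyadic-neuron idea concrete, the following is a minimal, illustrative sketch in Python/NumPy of two-state neurons in a feedforward network, loosely in the spirit of dual propagation. The layer sizes, the nudging strength `beta`, and the exact update rules are assumptions chosen for illustration; this is not the authors' implementation.

```python
import numpy as np

# Hypothetical sketch: each unit keeps two states, s_plus and s_minus.
# Their mean carries the forward (inference) signal; their difference
# carries the feedback (learning) signal.

rng = np.random.default_rng(0)
sizes = [4, 8, 3]  # input, hidden, and output widths (illustrative)
W = [rng.normal(0, 0.5, (n, m)) for m, n in zip(sizes[:-1], sizes[1:])]

def f(x):
    return np.tanh(x)

def dyadic_step(x, y, beta=0.1, steps=20, lr=0.05):
    # Initialize both state copies; the input layer is clamped to x.
    s_plus = [x] + [np.zeros(n) for n in sizes[1:]]
    s_minus = [x] + [np.zeros(n) for n in sizes[1:]]
    for _ in range(steps):
        for k in range(1, len(sizes)):
            mean_prev = 0.5 * (s_plus[k - 1] + s_minus[k - 1])
            drive = f(W[k - 1] @ mean_prev)
            if k < len(sizes) - 1:
                # Feedback is the state difference one layer up,
                # sent backward through the transposed weights.
                err = W[k].T @ (s_plus[k + 1] - s_minus[k + 1])
            else:
                # The output layer is nudged toward the target y.
                err = y - drive
            s_plus[k] = drive + 0.5 * beta * err
            s_minus[k] = drive - 0.5 * beta * err
    # Local weight update: outer product of the postsynaptic state
    # difference with the presynaptic state mean.
    for k in range(len(W)):
        mean_in = 0.5 * (s_plus[k] + s_minus[k])
        W[k] += (lr / beta) * np.outer(s_plus[k + 1] - s_minus[k + 1], mean_in)

x = rng.normal(size=4)
y = np.array([1.0, 0.0, 0.0])
dyadic_step(x, y)
```

Note how the update is purely local: each connection sees only the state difference of its postsynaptic units and the state mean of its presynaptic units, which is the property that lets activity-difference schemes such as EP and DP avoid explicit backpropagation.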
