Poster in Workshop: NeuroAI: Fusing Neuroscience and AI for Intelligent Solutions
A Walsh Hadamard Derived Linear Vector Symbolic Architecture
Mohammad Mahmudul Alam · Alexander Oberle · Edward Raff · Stella Biderman · Tim Oates · James Holt
Abstract:
Vector Symbolic Architectures (VSAs) are one approach to developing neuro-symbolic AI, where two vectors in $\mathbb{R}^d$ are 'bound' together to produce a new vector in the same space. VSAs support commutativity and associativity of this binding operation, along with an inverse operation, allowing one to construct symbolic-style manipulations over real-valued vectors. Most VSAs were developed before deep learning and automatic differentiation became popular, and instead focused on efficacy in hand-designed systems. In this work, we introduce the Hadamard-derived Linear Binding (HLB), which is designed for favorable computational efficiency, efficacy in classic VSA tasks, and strong performance in differentiable systems.
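To make the binding idea concrete, the sketch below illustrates a simple element-wise (Hadamard-product) binding with an approximate inverse. This is a generic illustration of the properties the abstract describes (commutativity, associativity, invertibility), not the HLB operator defined in the paper; the `bind`/`unbind` functions and the `eps` stabilization term are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 1024  # vector dimensionality

# Random real-valued symbol vectors in R^d.
x = rng.standard_normal(d)
y = rng.standard_normal(d)
w = rng.standard_normal(d)

def bind(a, b):
    """Element-wise (Hadamard-product) binding: commutative and associative."""
    return a * b

def unbind(c, b, eps=1e-6):
    """Approximate inverse binding: recover a from c = bind(a, b)."""
    return c * b / (b * b + eps)

z = bind(x, y)

# Commutativity and associativity of the binding operation.
assert np.allclose(bind(x, y), bind(y, x))
assert np.allclose(bind(bind(x, y), w), bind(x, bind(y, w)))

# Unbinding recovers x up to small numerical error from eps.
x_hat = unbind(z, y)
print(np.corrcoef(x, x_hat)[0, 1])  # correlation close to 1.0
```

Because both operations are built from element-wise products and divisions, they are cheap to compute and fully differentiable, which is the kind of behavior the abstract highlights for use inside deep learning systems.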