Poster in Workshop: Meta-Learning
MPLP: Learning a Message Passing Learning Protocol
Ettore Randazzo
Abstract:
We present a novel method for learning the weights of an artificial neural network: a Message Passing Learning Protocol (MPLP). In MPLP, we abstract every operation occurring in ANNs as independent agents. Each agent is responsible for ingesting incoming multidimensional messages from other agents, updating its internal state, and generating multidimensional messages to be passed on to neighbouring agents. We demonstrate the viability of MPLP as an alternative to traditional gradient-based approaches on simple feed-forward neural networks, and present a framework capable of generalizing to non-traditional neural network architectures. MPLP is meta-learned using end-to-end gradient-based meta-optimisation.
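The abstract describes agents that ingest multidimensional messages from neighbours, update an internal state, and emit new messages. The sketch below is not the authors' implementation; it is a minimal illustration of that agent abstraction under assumed names, dimensions, and update rules (a shared linear protocol with a tanh state update), with the protocol parameters being what MPLP would meta-learn rather than hand-design.

```python
# Minimal illustrative sketch of the agent abstraction described in the abstract.
# All names, dimensions, and update rules here are assumptions for illustration.
import numpy as np

MSG_DIM = 4     # dimensionality of messages exchanged between agents (assumed)
STATE_DIM = 8   # dimensionality of each agent's internal state (assumed)

class Agent:
    """One node in the message-passing graph."""

    def __init__(self, rng):
        # A tiny linear "protocol"; in MPLP the protocol's parameters are
        # meta-learned end-to-end rather than hand-designed.
        self.state = np.zeros(STATE_DIM)
        self.W_in = rng.normal(scale=0.1, size=(MSG_DIM + STATE_DIM, STATE_DIM))
        self.W_out = rng.normal(scale=0.1, size=(STATE_DIM, MSG_DIM))

    def step(self, incoming_messages):
        """Ingest neighbour messages, update internal state, emit a new message."""
        pooled = np.mean(incoming_messages, axis=0)   # aggregate neighbour messages
        x = np.concatenate([pooled, self.state])
        self.state = np.tanh(x @ self.W_in)           # internal state update
        return np.tanh(self.state @ self.W_out)       # outgoing message

# Usage: three fully connected agents exchanging messages for a few rounds.
rng = np.random.default_rng(0)
agents = [Agent(rng) for _ in range(3)]
messages = [np.zeros(MSG_DIM) for _ in agents]
for _ in range(5):
    messages = [
        agent.step(np.stack([m for j, m in enumerate(messages) if j != i]))
        for i, agent in enumerate(agents)
    ]
print(messages[0])
```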