Poster in Workshop: Symmetry and Geometry in Neural Representations
Rethinking Message Passing for Algorithmic Alignment
Joël Mathys · Florian Grötschla · Kalyan Nadimpalli · Roger Wattenhofer
Keywords: [ Algorithmic Alignment ] [ Size Generalization ] [ Message Passing ]
Most Graph Neural Networks are based on the message-passing principle, where all neighboring nodes exchange messages with each other simultaneously. We challenge this paradigm by introducing the Flood and Echo Net, a novel architecture that aligns neural computation with the principles of distributed algorithms. In our method, nodes activate sparsely upon receiving a message, leading to a wave-like activation pattern that traverses the graph. Through these sparse but parallel activations, the Flood and Echo Net becomes more expressive than traditional MPNNs, which are limited by the 1-WL test, and is also provably more efficient in terms of message complexity. Moreover, the mechanism's ability to generalize across graphs of varying sizes makes it a practical architecture for algorithmic learning. We evaluate the Flood and Echo Net on a variety of synthetic tasks and find that the algorithmic alignment of its execution improves generalization to larger graph sizes.
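The abstract describes a wave-like activation schedule rather than synchronous rounds. The sketch below is only an illustrative reconstruction of that scheduling idea, not the authors' implementation: assuming a single origin node, it computes the order in which nodes would activate during one flood (wave moving outward) and echo (wave returning) phase over an undirected graph given as an edge list. The function name `flood_and_echo_phase` is a hypothetical helper introduced here for illustration.

```python
from collections import defaultdict


def flood_and_echo_phase(edges, origin):
    """Illustrative sketch (not the paper's implementation) of a
    flood-and-echo activation schedule: nodes activate only when a
    message reaches them, first in outward BFS wavefronts from the
    origin (flood), then in the reverse order (echo)."""
    adj = defaultdict(list)
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)

    # Flood: BFS wavefronts moving outward from the origin.
    dist = {origin: 0}
    waves = [[origin]]
    while waves[-1]:
        next_wave = []
        for u in waves[-1]:
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    next_wave.append(v)
        waves.append(next_wave)
    waves.pop()  # drop the empty final wavefront

    # Echo: the same wavefronts activate again in reverse order,
    # carrying information back toward the origin.
    return waves + waves[-2::-1]
```

On a path graph 0-1-2-3 with origin 0, the schedule is `[[0], [1], [2], [3], [2], [1], [0]]`: each edge carries a constant number of messages per phase, which is the intuition behind the message-complexity claim, in contrast to synchronous message passing where every edge is active in every round.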