Contributed Talk (Lightning) in Workshop: Symmetry and Geometry in Neural Representations (NeurReps)
Expander Graph Propagation
Andreea Deac · Marc Lackenby · Petar Veličković
Keywords: [ graph neural networks ] [ graph representation learning ] [ expander graphs ] [ bottlenecks ] [ oversquashing ] [ Cayley graphs ] [ graph machine learning ] [ curvature ] [ group theory ]
Deploying graph neural networks (GNNs) on whole-graph classification or regression tasks is challenging, often requiring node features that are mindful of both local interactions and the graph's global context. GNN architectures need to avoid pathological behaviours, such as bottlenecks and oversquashing, while ideally having linear time and space complexity requirements. In this work, we propose an elegant approach based on propagating information over expander graphs. We provide an efficient method for constructing expander graphs of a given size, and use this insight to propose the EGP model. We show that EGP is able to address all of the above concerns, while requiring minimal effort to set up, and provide evidence of its empirical utility on relevant datasets and baselines in the Open Graph Benchmark. Importantly, using expander graphs as a template for message passing necessarily gives rise to negative curvature. While this appears to be counterintuitive in light of recent related work on oversquashing, we theoretically demonstrate that negatively curved edges are likely to be required to obtain scalable message passing without bottlenecks.
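The keywords point at Cayley graphs as the expander construction. As a minimal illustrative sketch (not the authors' code), one well-known family of expanders comes from Cayley graphs of SL(2, Z_n) generated by the two elementary matrices [[1,1],[0,1]] and [[1,0],[1,1]] together with their inverses; the snippet below builds such a graph by breadth-first multiplication from the identity. The function name and the choice of n are assumptions for the example.

```python
def cayley_graph_sl2(n):
    """Build the Cayley graph of SL(2, Z_n) with generators
    [[1,1],[0,1]], [[1,0],[1,1]] and their inverses.
    Matrices [[a,b],[c,d]] are stored as tuples (a, b, c, d).
    Returns a dict mapping each node to its set of neighbours."""
    def matmul(x, y):
        # 2x2 matrix product modulo n
        return ((x[0]*y[0] + x[1]*y[2]) % n, (x[0]*y[1] + x[1]*y[3]) % n,
                (x[2]*y[0] + x[3]*y[2]) % n, (x[2]*y[1] + x[3]*y[3]) % n)

    # the two elementary generators and their inverses (mod n)
    gens = [(1, 1, 0, 1), (1, 0, 1, 1),
            (1, n - 1, 0, 1), (1, 0, n - 1, 1)]
    identity = (1, 0, 0, 1)
    adj = {identity: set()}
    frontier = [identity]
    # breadth-first expansion: right-multiply by each generator
    while frontier:
        nxt = []
        for v in frontier:
            for g in gens:
                w = matmul(v, g)
                if w not in adj:
                    adj[w] = set()
                    nxt.append(w)
                adj[v].add(w)
                adj[w].add(v)  # undirected edge
        frontier = nxt
    return adj
```

Because the generator set is closed under inverses, the resulting graph is undirected and 4-regular (for n ≥ 3), and its size grows as |SL(2, Z_n)| ≈ n³, so a graph of roughly any desired size can be picked by choosing n. Such a fixed sparse expander could then serve as an auxiliary message-passing template alongside the input graph.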