Poster in Workshop: Symmetry and Geometry in Neural Representations

Efficient Subgraph GNNs via Graph Products and Coarsening

Guy Bar-Shalom · Yam Eitan · Fabrizio Frasca · Haggai Maron

Keywords: [ Symmetries ] [ Subgraph GNNs ] [ Equivariance ]


Abstract:

Subgraph Graph Neural Networks (Subgraph GNNs) enhance message-passing GNNs by representing graphs as sets of subgraphs. They achieve strong performance, but their computational complexity limits their application to larger graphs. Previous methods sample subsets of subgraphs, either randomly or via learnable policies, but these strategies lead to suboptimal subgraph selections or are restricted to small subset sizes, causing performance drops. This paper presents a new framework that overcomes these challenges. We use a graph coarsening function to cluster nodes into super-nodes with induced connectivity. The product of the coarsened graph with the original graph reveals an implicit structure in which subgraphs are associated with specific sets of nodes. By applying generalized message passing to this graph product, we obtain an efficient and powerful Subgraph GNN. Unlike previous methods, our approach allows flexible subgraph selection and is compatible with standard training pipelines. Additionally, we uncover new permutation symmetries in the resulting node feature tensor, which we leverage by designing linear equivariant layers for our Subgraph GNN architecture. Extensive experiments on several datasets show that our method is more flexible than previous approaches, effortlessly handling any number of subgraphs while consistently outperforming baselines.
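To make the construction concrete, here is a minimal sketch (not the authors' code) of the two structural ingredients the abstract describes: coarsening a graph into super-nodes with induced connectivity, and running plain message passing on the product of the coarsened graph with the original one. The hard cluster assignment, the choice of a Cartesian product, and the sum-aggregation ReLU update are illustrative assumptions.

```python
import numpy as np

def coarsen(A, assign):
    """Coarsen adjacency A (n x n) under a hard cluster assignment per node.

    Returns the k x k super-node adjacency induced by the clustering and
    the n x k assignment matrix C.
    """
    n = A.shape[0]
    k = int(assign.max()) + 1
    C = np.zeros((n, k))
    C[np.arange(n), assign] = 1.0           # C[i, assign[i]] = 1
    A_c = (C.T @ A @ C > 0).astype(float)   # induced connectivity, binarized
    np.fill_diagonal(A_c, 0.0)
    return A_c, C

def cartesian_product_adjacency(A_c, A):
    """Adjacency of the Cartesian product of the coarsened and original graphs.

    Nodes are pairs (super-node s, node v), indexed s * n + v;
    (s, v) ~ (s', v') iff (s = s' and v ~ v') or (v = v' and s ~ s').
    """
    k, n = A_c.shape[0], A.shape[0]
    return np.kron(A_c, np.eye(n)) + np.kron(np.eye(k), A)

def message_passing(A_prod, X, W):
    """One round of sum-aggregation message passing with a ReLU update."""
    return np.maximum(A_prod @ X @ W, 0.0)

# Toy example: a 4-cycle coarsened into 2 super-nodes.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
assign = np.array([0, 0, 1, 1])              # nodes {0,1} -> super-node 0, {2,3} -> 1
A_c, _ = coarsen(A, assign)
A_prod = cartesian_product_adjacency(A_c, A) # (k*n) x (k*n) = 8 x 8

rng = np.random.default_rng(0)
X0 = rng.standard_normal((4, 8))             # one feature row per original node
X = np.tile(X0, (2, 1))                      # copy features once per super-node
H = message_passing(A_prod, X, rng.standard_normal((8, 8)))
print(H.shape)                               # (8, 8): one row per (super-node, node) pair
```

Each block of n rows in H plays the role of one subgraph's node representations, tied to the node set of one super-node, which is what lets the number of subgraphs be controlled through the coarsening.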

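The abstract also mentions new permutation symmetries of the resulting node feature tensor. One plausible reading, assumed here rather than taken from the paper, is that the (super-node, node, channel) tensor can be permuted independently along its first two axes; linear maps equivariant to such a product of symmetric groups decompose into the identity plus broadcast means over each axis and over both, as in the following sketch.

```python
import numpy as np

def equivariant_linear(X, W_id, W_sup, W_node, W_all):
    """Linear layer equivariant to independent permutations of axes 0 and 1.

    X: (k, n, d) feature tensor; each W: (d, d) weight matrix.
    """
    sup_mean = X.mean(axis=0, keepdims=True)        # pool over super-nodes
    node_mean = X.mean(axis=1, keepdims=True)       # pool over original nodes
    all_mean = X.mean(axis=(0, 1), keepdims=True)   # global pool
    return X @ W_id + sup_mean @ W_sup + node_mean @ W_node + all_mean @ W_all

rng = np.random.default_rng(1)
X = rng.standard_normal((2, 4, 8))
Ws = [rng.standard_normal((8, 8)) for _ in range(4)]

# Equivariance check: permuting either axis commutes with the layer.
p, q = rng.permutation(2), rng.permutation(4)
assert np.allclose(equivariant_linear(X, *Ws)[p][:, q],
                   equivariant_linear(X[p][:, q], *Ws))
```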