

Poster

Boosting Graph Pooling with Persistent Homology

Chaolong Ying · Xinjian Zhao · Tianshu Yu

Thu 12 Dec 11 a.m. PST — 2 p.m. PST

Abstract:

Recently, there has been an emerging trend of integrating persistent homology (PH) into graph neural networks (GNNs) to enrich their expressive power. However, naively plugging PH features into GNN layers consistently yields only marginal improvements and low interpretability. In this paper, we investigate a novel mechanism for injecting global topological invariance into pooling layers using PH, motivated by the observation that the filtration operation in PH naturally aligns with graph pooling in a cut-off manner. In this fashion, message passing in the coarsened graph acts along the persistent pooled topology, leading to improved performance. Experimentally, we apply our mechanism to a collection of graph pooling methods and observe consistent and substantial performance gains on several popular datasets, demonstrating its wide applicability and flexibility.
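To make the filtration-as-cut-off intuition concrete, the sketch below shows one plausible (not the authors') instantiation: compute 0-dimensional persistence of a simple degree-based vertex filtration with a union-find, score each node by the lifetime of the component it creates, and keep the most persistent nodes as the coarsened graph on which message passing would run. The filtration choice, the pooling ratio, and the function names (node_persistence, persistence_pool) are illustrative assumptions only.

```python
# Illustrative sketch of persistence-guided top-k pooling, assuming a degree filtration.
import networkx as nx


def node_persistence(G: nx.Graph, filt: dict) -> dict:
    """0-dim persistence of a sublevel vertex filtration (elder rule via union-find)."""
    parent = {v: v for v in G}
    birth = dict(filt)  # each node starts its own component at its filtration value

    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]
            v = parent[v]
        return v

    pers = {v: 0.0 for v in G}
    # An edge enters the filtration at the larger filtration value of its endpoints.
    edges = sorted(G.edges(), key=lambda e: max(filt[e[0]], filt[e[1]]))
    for u, v in edges:
        t = max(filt[u], filt[v])
        ru, rv = find(u), find(v)
        if ru == rv:
            continue
        # Elder rule: the younger component (larger birth value) dies at time t.
        young, old = (ru, rv) if birth[ru] > birth[rv] else (rv, ru)
        pers[young] = t - birth[young]
        parent[young] = old
    # Components that never die are "essential"; give them the largest lifetime plus a margin.
    max_f = max(filt.values())
    for v in G:
        if find(v) == v:
            pers[v] = max_f - birth[v] + 1.0
    return pers


def persistence_pool(G: nx.Graph, ratio: float = 0.5) -> nx.Graph:
    """Keep the most persistent nodes and return the induced coarsened graph."""
    filt = {v: float(d) for v, d in G.degree()}  # assumed filtration: node degree
    pers = node_persistence(G, filt)
    k = max(1, int(ratio * G.number_of_nodes()))
    keep = sorted(G, key=lambda v: pers[v], reverse=True)[:k]
    return G.subgraph(keep).copy()  # downstream message passing would act on this graph


if __name__ == "__main__":
    G = nx.barbell_graph(5, 2)
    pooled = persistence_pool(G, ratio=0.5)
    print(pooled.number_of_nodes(), "nodes kept of", G.number_of_nodes())
```

In this reading, the persistence score plays the role of the cut-off: nodes whose components live longest in the filtration survive pooling, so the coarsened graph preserves globally persistent topology rather than locally salient features alone.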
