Poster
Compositional PAC-Bayes: Generalization of GNNs with persistence and beyond
Kirill Brilliantov · Amauri Souza · Vikas Garg
Descriptors based on Persistent Homology (PH) are increasingly being integrated into Graph Neural Networks (GNNs) to augment them with rich topological features. However, the generalization of PH schemes remains unexplored. The heterogeneity of GNN layers and persistence vectorization components poses further key challenges in analyzing the generalization behavior of the overall model. We introduce a novel compositional PAC-Bayes framework that accommodates a broad spectrum of models, including those with heterogeneous layers, and provides the first data-dependent generalization bounds for a widely adopted PH vectorization scheme (which subsumes persistence landscapes, images, and silhouettes) as well as for persistence-augmented GNNs. Our bounds also inform the design of novel regularizers, and existing bounds for GNNs and neural nets are recovered with ease. Empirical evaluations on several standard real-world datasets demonstrate that our bounds accurately predict generalization performance, leading to improved classifier design via our regularizers. Overall, this work bridges a crucial gap in the theoretical understanding of PH methods and general heterogeneous models, paving the way for the design of better models for (graph) representation learning.
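To make the notion of a PH vectorization concrete, the sketch below computes a persistence silhouette, one of the standard schemes the abstract mentions as being subsumed by the framework. It is an illustrative example only, not the paper's exact scheme or notation; the function name `persistence_silhouette` and the weight exponent `p` are assumptions for this sketch.

```python
import numpy as np

def persistence_silhouette(diagram, grid, p=1.0):
    """Vectorize a persistence diagram as a silhouette sampled on `grid`.

    diagram: array of shape (n, 2) with (birth, death) pairs.
    grid:    1-D array of sample points t_1 < ... < t_m.
    p:       weight exponent; w_i = (death_i - birth_i) ** p (illustrative choice).
    """
    births, deaths = diagram[:, 0], diagram[:, 1]
    pers = deaths - births
    weights = pers ** p
    # Each point contributes a "tent" peaking at the midpoint with height pers/2.
    mid = (births + deaths) / 2.0
    half = pers / 2.0
    # tents[i, j] = value of the i-th tent at grid point t_j
    tents = np.clip(half[:, None] - np.abs(grid[None, :] - mid[:, None]), 0.0, None)
    # Weighted average of the tents, evaluated on the grid -> fixed-length vector.
    return (weights[:, None] * tents).sum(axis=0) / weights.sum()

# Toy usage: a diagram with two topological features, sampled on 8 grid points.
diag = np.array([[0.0, 1.0], [0.2, 0.6]])
t = np.linspace(0.0, 1.0, 8)
print(persistence_silhouette(diag, t))
```

A fixed-length vector like this can be concatenated with GNN-computed graph representations, which is the kind of heterogeneous composition the PAC-Bayes framework above is designed to analyze.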