

Poster

Deep Graph Neural Networks via Posteriori-Sampling-based Node-Adaptive Residual Module

Jingbo Zhou · Yixuan Du · Ruqiong Zhang · Jun Xia · Zhizhi Yu · Zelin Zang · Di Jin · Carl Yang · Rui Zhang · Stan Z. Li

East Exhibit Hall A-C #3001
Fri 13 Dec 11 a.m. PST — 2 p.m. PST

Abstract:

Graph Neural Networks (GNNs), a class of neural networks that learn from graph-structured data by aggregating neighborhood information, have shown strong performance on various downstream tasks. However, as the number of layers increases, node representations become indistinguishable, a phenomenon known as over-smoothing. Many residual methods have emerged to address this issue. In this paper, we focus on over-smoothing and the related residual methods. First, we revisit over-smoothing from the perspective of overlapping neighborhood subgraphs and, on this basis, explain how residual methods alleviate over-smoothing by integrating neighborhood subgraphs of multiple orders, thereby avoiding the indistinguishability caused by a single high-order neighborhood subgraph. We then reveal the drawbacks of previous residual methods, such as the lack of node adaptability and the severe loss of high-order neighborhood subgraph information, and propose a Posterior-Sampling-based, Node-Adaptive Residual module (PSNR). We theoretically demonstrate that PSNR alleviates these drawbacks. Furthermore, extensive experiments verify the superiority of the PSNR module in both fully observed node classification and missing-feature scenarios. Our code is available at https://github.com/jingbo02/PSNR-GNN.
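To make the idea of a node-adaptive residual concrete, below is a minimal sketch of a GNN layer in which each node mixes its propagated representation with its initial representation via a learned per-node coefficient. This is an illustrative toy only, assuming a simple GCN-style propagation and a sigmoid gate; the gating parameterization and the `NodeAdaptiveResidualLayer` name are hypothetical and do not reproduce the paper's posterior-sampling PSNR module, which is defined in the paper and the linked repository.

```python
# Illustrative sketch (not the PSNR module): per-node residual mixing in a deep GNN.
import torch
import torch.nn as nn


def normalized_adjacency(adj: torch.Tensor) -> torch.Tensor:
    """Symmetrically normalized adjacency with self-loops: D^-1/2 (A + I) D^-1/2."""
    a_hat = adj + torch.eye(adj.size(0))
    d_inv_sqrt = a_hat.sum(dim=1).pow(-0.5)
    return d_inv_sqrt.unsqueeze(1) * a_hat * d_inv_sqrt.unsqueeze(0)


class NodeAdaptiveResidualLayer(nn.Module):
    """One propagation step with a learned per-node residual coefficient (hypothetical design)."""

    def __init__(self, dim: int):
        super().__init__()
        self.linear = nn.Linear(dim, dim)
        # The gate reads a node's propagated and residual states and outputs
        # a scalar mixing weight in (0, 1) for that node.
        self.gate = nn.Linear(2 * dim, 1)

    def forward(self, h: torch.Tensor, h0: torch.Tensor, a_norm: torch.Tensor) -> torch.Tensor:
        propagated = torch.relu(self.linear(a_norm @ h))  # neighborhood aggregation
        alpha = torch.sigmoid(self.gate(torch.cat([propagated, h0], dim=-1)))  # per-node weight
        return alpha * propagated + (1.0 - alpha) * h0    # node-adaptive residual mix


if __name__ == "__main__":
    n, d, depth = 6, 16, 8
    adj = (torch.rand(n, n) < 0.3).float()
    adj = ((adj + adj.t()) > 0).float()       # symmetrize the random graph
    a_norm = normalized_adjacency(adj)
    x = torch.randn(n, d)
    layer = NodeAdaptiveResidualLayer(d)
    h = x
    for _ in range(depth):                    # deep stack: residual keeps nodes distinguishable
        h = layer(h, x, a_norm)
    print(h.shape)                            # torch.Size([6, 16])
```

The per-node coefficient is what distinguishes this from a fixed, layer-wide residual weight: nodes whose high-order neighborhoods remain informative can lean toward the propagated term, while others retain more of their initial features.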
