

Poster

DiGRAF: Diffeomorphic Graph-Adaptive Activation Function

Krishna Sri Ipsit Mantri · Xinzhi Wang · Carola-Bibiane Schönlieb · Bruno Ribeiro · Beatrice Bevilacqua · Moshe Eliasof

Wed 11 Dec 11 a.m. PST — 2 p.m. PST

Abstract:

In this paper, we propose a novel activation function tailored specifically for graph data in Graph Neural Networks (GNNs). Motivated by the need for graph-adaptive and flexible activation functions, we introduce DiGRAF, which leverages Continuous Piecewise-Affine Based (CPAB) transformations augmented with an additional GNN to learn a graph-adaptive diffeomorphic activation function in an end-to-end manner. Beyond its graph-adaptivity and flexibility, DiGRAF also possesses properties widely recognized as desirable for activation functions, such as differentiability, boundedness within the domain, and computational efficiency. We conduct an extensive set of experiments across diverse datasets and tasks, demonstrating the consistent and superior performance of DiGRAF compared to traditional and graph-specific activation functions, and highlighting its effectiveness as an activation function for GNNs.
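To make the idea concrete, the sketch below is a rough, hedged illustration (not the authors' implementation): a small mean-aggregation message-passing layer pools the graph into parameters of a simplified piecewise velocity field, whose numerical integration yields a bounded, differentiable warp applied element-wise as the activation. The class name, hyperparameters, Euler integration, and the piecewise-constant velocity (standing in for CPAB's continuous piecewise-affine field, which admits a closed-form integral) are all illustrative assumptions.

```python
# Hedged sketch of a DiGRAF-style graph-adaptive activation (illustrative only).
import torch
import torch.nn as nn


class GraphAdaptiveActivation(nn.Module):
    def __init__(self, hidden_dim: int, n_intervals: int = 8,
                 domain: tuple = (-3.0, 3.0), n_steps: int = 10):
        super().__init__()
        self.n_intervals = n_intervals
        self.lo, self.hi = domain
        self.n_steps = n_steps
        # Tiny GNN (one mean-aggregation layer) mapping node features to
        # per-graph velocity-field coefficients theta (one value per interval).
        self.lin = nn.Linear(hidden_dim, hidden_dim)
        self.to_theta = nn.Linear(hidden_dim, n_intervals)

    def velocity(self, x, theta):
        # Simplified piecewise-constant velocity field on [lo, hi]; outside the
        # domain the velocity is zero, so the resulting map stays bounded there.
        width = (self.hi - self.lo) / self.n_intervals
        idx = ((x - self.lo) / width).floor().clamp(0, self.n_intervals - 1).long()
        inside = (x >= self.lo) & (x <= self.hi)
        return torch.gather(theta.expand(x.shape[0], -1), 1, idx) * inside

    def forward(self, h, adj):
        # h: (num_nodes, hidden_dim) node features; adj: (num_nodes, num_nodes) adjacency.
        deg = adj.sum(-1, keepdim=True).clamp(min=1.0)
        msg = torch.relu(self.lin(adj @ h / deg))           # one message-passing step
        theta = self.to_theta(msg.mean(0, keepdim=True))    # graph-level pooling -> theta
        # Integrate dx/dt = v_theta(x) with a few Euler steps; this approximates the
        # flow of the velocity field and is applied element-wise as the activation.
        x = h
        dt = 1.0 / self.n_steps
        for _ in range(self.n_steps):
            x = x + dt * self.velocity(x.reshape(-1, 1), theta).reshape_as(x)
        return x


if __name__ == "__main__":
    h = torch.randn(5, 16)                  # 5 nodes, 16-dim features
    adj = (torch.rand(5, 5) > 0.5).float()  # random adjacency (toy example)
    act = GraphAdaptiveActivation(16)
    print(act(h, adj).shape)                # torch.Size([5, 16])
```

Because theta is produced from a pooled graph representation, the learned warp differs per graph while remaining a smooth, invertible transformation of the node features, which is the graph-adaptive and diffeomorphic behavior the abstract describes.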
