Poster in Workshop: Table Representation Learning (TRL)
Learnable Numerical Input Normalization for Tabular Representation Learning based on B-splines
Min-Kook Suh · Moonjung Eo · Ye Seul Sim · Woohyung Lim
Keywords: [ Input normalization ] [ Tabular neural networks ]
Input normalization of numerical features is essential for improving the performance and training stability of neural networks. This is particularly important in tabular deep learning, where feature heterogeneity calls for a different normalization for each input feature. While various normalization techniques have been proposed, they are often specialized for specific input distributions and, more importantly, are not optimized for neural network training. In this paper, we propose a learnable input normalization technique based on B-splines, which provides a flexible, differentiable curve-based transformation for each feature. A key characteristic of our method is its novel loss function, which adapts the transformation to the difficulty of each data point, thereby boosting model performance on challenging samples. We evaluate the proposed method on OpenML datasets using several popular neural network architectures, demonstrating that the learnable normalization consistently outperforms conventional techniques.
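To make the idea concrete, here is a minimal sketch of a per-feature B-spline normalizer of the kind the abstract describes: each feature is mapped through a curve y = Σ_k c_k B_k(x), where the B_k are fixed B-spline basis functions and the coefficients c_k are the learnable parameters. The class name, knot layout, initialization, and the difficulty-weighted update are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def bspline_basis(x, knots, degree):
    """Cox-de Boor evaluation of all B-spline basis functions at points x."""
    x = np.asarray(x, dtype=float)
    # degree-0 bases: indicators of half-open knot intervals
    B = np.zeros((len(x), len(knots) - 1))
    for i in range(len(knots) - 1):
        B[:, i] = (x >= knots[i]) & (x < knots[i + 1])
    # close the last non-empty interval so x == knots[-1] is covered
    B[x == knots[-1], len(knots) - degree - 2] = 1.0
    for d in range(1, degree + 1):
        B_new = np.zeros((len(x), len(knots) - d - 1))
        for i in range(len(knots) - d - 1):
            term = np.zeros(len(x))
            left = knots[i + d] - knots[i]
            if left > 0:  # the 0/0 = 0 convention skips empty spans
                term += (x - knots[i]) / left * B[:, i]
            right = knots[i + d + 1] - knots[i + 1]
            if right > 0:
                term += (knots[i + d + 1] - x) / right * B[:, i + 1]
            B_new[:, i] = term
        B = B_new
    return B  # shape (len(x), n_bases)

class BSplineNormalizer:
    """One feature's learnable transform: y = basis(x) @ coef (sketch)."""
    def __init__(self, n_bases=8, degree=3, lo=0.0, hi=1.0):
        self.degree = degree
        inner = np.linspace(lo, hi, n_bases - degree + 1)
        # clamped knot vector: repeat the end knots `degree` times
        self.knots = np.concatenate([[lo] * degree, inner, [hi] * degree])
        # near-identity start: monotone coefficients give a monotone curve
        self.coef = np.linspace(0.0, 1.0, n_bases)

    def transform(self, x):
        return bspline_basis(x, self.knots, self.degree) @ self.coef

    def grad_step(self, x, dloss_dy, weights, lr=0.1):
        # assumed difficulty weighting: harder samples get larger weights,
        # so the curve adapts more where the downstream model struggles
        B = bspline_basis(x, self.knots, self.degree)
        self.coef -= lr * B.T @ (weights * dloss_dy) / len(x)
```

Because the transform is linear in the coefficients, its gradient is just the basis matrix, so the curve can be trained end to end with the network; the `weights` argument stands in for whatever per-sample difficulty signal the loss provides.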