Oral presentation in Affinity Event: LatinX in AI

A Kernel Two-Sample Test with the Representation Jensen-Shannon Divergence

Jhoan Keider Hoyos

Tue 10 Dec 9:10 a.m. PST — 9:20 a.m. PST

Abstract:

We introduce a novel kernel-based information-theoretic framework for two-sample testing, leveraging the representation Jensen-Shannon divergence (RJSD). RJSD captures higher-order information from covariance operators in reproducing kernel Hilbert spaces and avoids Gaussianity assumptions, providing a robust and flexible measure of divergence between distributions. We develop RJSD-based variants of Maximum Mean Discrepancy (MMD) approaches, demonstrating superior discriminative power in extensive experiments on synthetic and real-world datasets. Our results position RJSD as a powerful alternative to MMD, with the potential to significantly impact kernel-based learning and distribution comparison. By establishing RJSD as a benchmark for two-sample testing, this work lays the foundation for future research in kernel-based divergence estimation and its broad range of applications in machine learning.
