

Poster

A Kernel Perspective on Distillation-based Collaborative Learning

Sejun Park · Kihun Hong · Ganguk Hwang

Fri 13 Dec 4:30 p.m. PST — 7:30 p.m. PST

Abstract:

In response to various situational demands, there is growing interest in methodologies for enhancing the AI models of multiple parties through collaboration. However, it remains challenging to achieve sufficient performance gains without sharing the private data and models owned by individual parties. One recent promising approach is to develop distillation-based algorithms that exploit unlabeled public data, but the results are still unsatisfactory in both theory and practice. To tackle this problem, we rigorously analyze a representative distillation-based algorithm from the viewpoint of kernel regression. This work provides the first theoretical results proving the (nearly) minimax optimality of a privacy-preserving nonparametric collaborative learning algorithm in massively distributed, statistically heterogeneous environments. Inspired by our theoretical results, we also propose a practical distillation-based collaborative learning algorithm built on neural network architectures. Our algorithm bridges the gap between our theoretical assumptions and practical settings with neural networks through feature kernel matching. We simulate various regression tasks to verify our theory and demonstrate the practical feasibility of the proposed algorithm.
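To make the setting concrete, here is a minimal sketch (not the authors' algorithm) of distillation-based collaborative learning viewed as kernel regression: each party fits a kernel ridge regressor on its private data, the parties exchange only their predictions on shared unlabeled public inputs, and a student regressor is distilled from the averaged pseudo-labels. The target function, party sample sizes, noise levels, and kernel bandwidth are all illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=2.0):
    # Gram matrix K[i, j] = exp(-gamma * ||x_i - y_j||^2)
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def krr_fit(X, y, lam=1e-3, gamma=2.0):
    # Kernel ridge regression: solve (K + n*lam*I) alpha = y
    n = len(X)
    alpha = np.linalg.solve(rbf_kernel(X, X, gamma) + n * lam * np.eye(n), y)
    return (X, alpha, gamma)

def krr_predict(model, X_new):
    X_tr, alpha, gamma = model
    return rbf_kernel(X_new, X_tr, gamma) @ alpha

rng = np.random.default_rng(0)
f = lambda x: np.sin(3.0 * x[:, 0])  # unknown target regression function (illustrative)

# Statistically heterogeneous parties: different private sample sizes and
# noise levels; each party fits a local regressor on its own data only.
local_models = []
for n, noise in [(15, 0.05), (30, 0.2), (60, 0.4)]:
    X = rng.uniform(-1.5, 1.5, size=(n, 1))
    local_models.append(krr_fit(X, f(X) + noise * rng.standard_normal(n)))

# Distillation step: parties share only predictions on unlabeled public
# inputs, never their raw data or model parameters.
X_pub = rng.uniform(-1.5, 1.5, size=(200, 1))
pseudo_y = np.mean([krr_predict(m, X_pub) for m in local_models], axis=0)

# A single student regressor is distilled from the aggregated pseudo-labels.
student = krr_fit(X_pub, pseudo_y)
X_test = np.linspace(-1.4, 1.4, 100)[:, None]
mse = float(np.mean((krr_predict(student, X_test) - f(X_test)) ** 2))
print(mse)
```

The averaging of local predictions here is only the simplest aggregation rule; the paper's analysis concerns the optimality of such distillation schemes under heterogeneity, and its neural-network variant replaces the fixed kernel with feature kernel matching.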
