

Poster

Improving Adaptivity via Over-Parameterization in Sequence Models

Yicheng Li · Qian Lin

Thu 12 Dec 11 a.m. PST — 2 p.m. PST

Abstract:

It is well known that the eigenfunctions of a kernel play a crucial role in kernel regression. Through several examples, we demonstrate that, even with the same set of eigenfunctions, the order of these functions significantly impacts regression outcomes. Leveraging the Le Cam equivalence, a statistical equivalence between the sequence model and kernel regression, we introduce an over-parameterized gradient descent in the sequence model to capture the effects of different orderings of a fixed set of eigenfunctions. Our theoretical results show that the over-parameterized gradient flow can adapt to the underlying structure of the signal and significantly outperform the vanilla gradient flow method. Moreover, we demonstrate that deeper over-parameterization can further enhance the generalization capability of the model. These results not only provide a new perspective on the benefits of over-parameterization but also offer insights into the adaptivity and generalization potential of neural networks beyond the kernel regime.
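
The following is a minimal, illustrative sketch (not the authors' implementation) of the general idea in a Gaussian sequence model y_i = theta*_i + sigma * z_i. The difference-of-squares parameterization theta_i = u_i^2 - w_i^2, the initialization scale, the step size, and the oracle stopping time are all assumptions chosen for illustration, standing in for the over-parameterized gradient flow analyzed in the paper.

```python
# Sketch: over-parameterized gradient descent in a Gaussian sequence model.
# Assumed ingredients (not from the paper): theta_i = u_i^2 - w_i^2, small
# initialization alpha, step size lr, and an oracle early-stopping time.
import numpy as np

rng = np.random.default_rng(0)
n, sigma = 1000, 0.3
theta_star = np.zeros(n)
theta_star[:10] = 2.0                      # sparse signal: 10 active coefficients
y = theta_star + sigma * rng.standard_normal(n)

# Vanilla gradient flow on 0.5 * ||y - theta||^2 over theta itself converges to
# theta = y, so its risk is essentially the noise level sigma^2.
vanilla_risk = np.mean((y - theta_star) ** 2)

# Depth-2 over-parameterization theta_i = u_i^2 - w_i^2 with small initialization:
# small coefficients grow multiplicatively and hence slowly, so with early stopping
# the estimate adapts to the sparse structure of the signal.
alpha, lr, T = 1e-3, 0.05, 300
u = np.full(n, alpha)
w = np.full(n, alpha)
best_risk = np.inf
for t in range(T):
    theta = u ** 2 - w ** 2
    best_risk = min(best_risk, np.mean((theta - theta_star) ** 2))  # oracle stop
    resid = theta - y                      # gradient of the loss w.r.t. theta
    u, w = u - lr * 2 * resid * u, w + lr * 2 * resid * w

print(f"vanilla risk:            {vanilla_risk:.4f}")
print(f"over-parameterized risk: {best_risk:.4f}  (at an oracle stopping time)")
```

In this toy setting the over-parameterized iterate fits the large (true) coefficients well before the small noise coefficients grow, which is one way to picture the adaptivity claimed in the abstract; a deeper product parameterization such as theta_i = u_i^D could be substituted analogously to probe the effect of depth.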
