Spectra of the Conjugate Kernel and Neural Tangent Kernel for linear-width neural networks
Zhou Fan, Zhichao Wang
Oral presentation: Orals & Spotlights Track 34: Deep Learning
on 2020-12-10T18:30:00-08:00 - 2020-12-10T18:45:00-08:00
Abstract: We study the eigenvalue distributions of the Conjugate Kernel (CK) and Neural Tangent Kernel (NTK) associated with multi-layer feedforward neural networks. In an asymptotic regime where the network width increases linearly with the sample size, under random initialization of the weights, and for input samples satisfying a notion of approximate pairwise orthogonality, we show that the eigenvalue distributions of the CK and NTK converge to deterministic limits. The limit for the CK is described by iterating the Marcenko-Pastur map across the hidden layers. The limit for the NTK is equivalent to that of a linear combination of the CK matrices across layers, and may be described by recursive fixed-point equations that extend this Marcenko-Pastur map. We demonstrate the agreement of these asymptotic predictions with the observed spectra for both synthetic and CIFAR-10 training data, and we perform a small simulation to investigate the evolution of these spectra over training.
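The agreement described in the abstract can be checked numerically. Below is a minimal sketch (not the authors' code) that simulates the CK of a random one-hidden-layer network at linear width on Gaussian inputs and overlays the classical Marcenko-Pastur density as a reference curve. The dimensions d, n, N and the activation are illustrative choices; the activation is taken to be centered, unit-variance, and orthogonal to the linear function under the Gaussian, a case in which the one-layer CK spectrum is expected to be close to the plain Marcenko-Pastur law, whereas general activations and deeper networks require the iterated Marcenko-Pastur map described in the abstract.

```python
# Minimal sketch (assumptions noted above): empirical CK spectrum of a random
# one-hidden-layer network at linear width vs. the classical Marcenko-Pastur density.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)

d, n, N = 1000, 2000, 3000      # input dim, sample size, hidden width (all linear in n)
gamma = n / N                   # Marcenko-Pastur aspect ratio for the n x n CK

X = rng.standard_normal((d, n))  # i.i.d. Gaussian inputs (approximately pairwise orthogonal)
W = rng.standard_normal((N, d))  # random first-layer weights

def sigma(z):
    # (z^2 - 1)/sqrt(2): mean 0, variance 1, and E[Z * sigma(Z)] = 0 for Z ~ N(0, 1)
    return (z**2 - 1.0) / np.sqrt(2.0)

X1 = sigma(W @ X / np.sqrt(d))   # post-activation features, shape (N, n)
CK = X1.T @ X1 / N               # Conjugate Kernel, shape (n, n)
eigs = np.linalg.eigvalsh(CK)

# Classical Marcenko-Pastur density with ratio gamma, supported on [lam_minus, lam_plus]
lam_minus, lam_plus = (1 - np.sqrt(gamma))**2, (1 + np.sqrt(gamma))**2
x = np.linspace(lam_minus, lam_plus, 400)
mp_density = np.sqrt((lam_plus - x) * (x - lam_minus)) / (2 * np.pi * gamma * x)

plt.hist(eigs, bins=60, density=True, alpha=0.5, label="empirical CK eigenvalues")
plt.plot(x, mp_density, "r-", label="Marcenko-Pastur density")
plt.xlabel("eigenvalue")
plt.ylabel("density")
plt.legend()
plt.show()
```

Increasing the depth (feeding X1 through further random layers before forming the kernel) deforms the histogram away from this single Marcenko-Pastur shape, which is where the iterated map and the fixed-point equations of the paper come in.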