Poster

Trade-Offs of Diagonal Fisher Information Matrix Estimators

Alexander Soen · Ke Sun

Poster Room - TBD
Thu 12 Dec 11 a.m. PST — 2 p.m. PST

Abstract:

The Fisher information matrix can be used to characterize the local geometry of the parameter space of neural networks. It elucidates insightful theories and useful tools to understand and optimize neural networks. Given its high computational cost, practitioners often use random estimators and evaluate only the diagonal entries. We examine two popular estimators whose accuracy and sample complexity depend on their associated variances. We derive bounds of the variances and instantiate them in neural networks for regression and classification. We navigate trade-offs for both estimators based on analytical and numerical studies. We find that the variance quantities depend on the non-linearity w.r.t. different parameter groups and should not be neglected when estimating the Fisher information.
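The abstract does not specify the estimators studied; as a hypothetical illustration of the general idea, the sketch below estimates the diagonal Fisher information of a logistic-regression model by Monte Carlo: labels are resampled from the model itself, and squared per-parameter scores are averaged. The function name and setup are this note's own assumptions, not the paper's method.

```python
import numpy as np

rng = np.random.default_rng(0)

def fisher_diag_mc(theta, X, n_samples=100):
    """Monte Carlo estimate of the diagonal Fisher information for
    logistic regression p(y=1|x, theta) = sigmoid(x @ theta).

    Labels y are drawn from the model distribution (not observed data),
    so the squared score averages to the true Fisher; the sampling
    introduces the estimator variance discussed in the abstract.
    (Hypothetical example, not the paper's estimators.)
    """
    est = np.zeros_like(theta)
    for _ in range(n_samples):
        p = 1.0 / (1.0 + np.exp(-X @ theta))        # model probabilities
        y = (rng.random(len(p)) < p).astype(float)  # y ~ p(y|x, theta)
        score = X * (y - p)[:, None]                # per-example score vector
        est += (score ** 2).mean(axis=0)            # diagonal of score outer product
    return est / n_samples
```

For this model the diagonal entries have a closed form, E_x[p(1-p) x_i^2], which can be used to check how fast the Monte Carlo estimate concentrates as `n_samples` grows.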
