Spotlight in Workshop: The Symbiosis of Deep Learning and Differential Equations -- III
Adaptive Resolution Residual Networks
Léa Demeule · Mahtab Sandhu · Glen Berseth
Keywords: [ Neural operator ] [ Laplacian pyramid ] [ Laplacian dropout ] [ Laplacian residual ] [ Convolutional Network ] [ Rediscretization ] [ Adaptive resolution ] [ Residual Network ]
We introduce Adaptive Resolution Residual Networks (ARRNs), a form of neural operator that enables the creation of networks for signal-based tasks that can be rediscretized to suit any signal resolution. ARRNs are composed of a chain of Laplacian residuals, each of which contains ordinary layers that do not themselves need to be rediscretizable for the whole network to be rediscretizable. ARRNs require fewer Laplacian residuals for exact evaluation on lower-resolution signals, which greatly reduces computational cost. ARRNs also implement Laplacian dropout, which encourages networks to become robust to low-bandwidth signals. ARRNs can thus be trained once at high resolution and then rediscretized on the fly at the suitable resolution with great robustness.
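The construction underlying Laplacian residuals and Laplacian dropout is the Laplacian pyramid: a signal is split into a low-resolution base plus per-level detail bands, and the decomposition is exactly invertible. The sketch below illustrates this in NumPy for 1D signals, using a simple 2x average-pool downsample and nearest-neighbor upsample; the function names, the filter choices, and the `laplacian_dropout` variant (zeroing detail bands to simulate low-bandwidth inputs) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def downsample(x):
    # Halve the resolution of a 1D signal by 2x average pooling.
    return x.reshape(-1, 2).mean(axis=1)

def upsample(x):
    # Double the resolution by nearest-neighbor repetition.
    return np.repeat(x, 2)

def laplacian_pyramid(x, levels):
    # Decompose x into `levels` detail bands plus a low-resolution base.
    bands = []
    for _ in range(levels):
        low = downsample(x)
        bands.append(x - upsample(low))  # detail band at current resolution
        x = low
    bands.append(x)  # low-resolution base
    return bands

def reconstruct(bands):
    # Invert the decomposition exactly: upsample and add back each band.
    x = bands[-1]
    for band in reversed(bands[:-1]):
        x = upsample(x) + band
    return x

def laplacian_dropout(bands, p, rng):
    # Illustrative sketch (assumption): independently zero each detail
    # band with probability p, keeping the low-resolution base, so the
    # network learns to tolerate missing high-frequency content.
    return [b * float(rng.random() >= p) for b in bands[:-1]] + [bands[-1]]

x = np.arange(16.0)
bands = laplacian_pyramid(x, 3)
assert np.allclose(reconstruct(bands), x)  # decomposition is exactly invertible
```

Because each detail band lives at its own resolution, evaluating on a lower-resolution input simply means starting the chain with fewer bands, which is the mechanism behind the reduced cost of rediscretized evaluation described above.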