Poster
Enabling certification of verification-agnostic networks via memory-efficient semidefinite programming
Sumanth Dathathri · Krishnamurthy Dvijotham · Alexey Kurakin · Aditi Raghunathan · Jonathan Uesato · Rudy Bunel · Shreya Shankar · Jacob Steinhardt · Ian Goodfellow · Percy Liang · Pushmeet Kohli
Poster Session 1 #316
Abstract:
Convex relaxations have emerged as a promising approach for verifying properties of neural networks, but widely used Linear Programming (LP) relaxations only provide meaningful certificates when networks are specifically trained to facilitate verification. This precludes many important applications which involve \emph{verification-agnostic} networks, i.e., networks not trained specifically to promote verifiability. On the other hand, semidefinite programming (SDP) relaxations have shown success on verification-agnostic networks, such as adversarially trained image classifiers without additional regularization, but do not currently scale beyond small networks due to poor time and space asymptotics. In this work, we propose a first-order dual SDP algorithm that (1) provides any-time bounds, (2) requires memory only linear in the total number of network activations, and (3) has per-iteration complexity that scales linearly with the complexity of a forward/backward pass through the network. By exploiting iterative eigenvector methods, we express all solver operations in terms of forward and backward passes through the network, enabling efficient use of hardware optimized for deep learning. This allows us to dramatically improve $\ell_\infty$ verified robust accuracy on verification-agnostic networks ($1\% \to 88\%$ on MNIST, $6\% \to 40\%$ on CIFAR-10). We also demonstrate tight verification of a quadratic stability specification for the decoder of a variational autoencoder.
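The central computational trick in the abstract, replacing explicit eigendecompositions with iterative eigenvector methods whose matrix-vector products are computed by automatic differentiation, can be sketched in a few lines. The following is a minimal, self-contained JAX illustration and not the authors' implementation: the toy matrix `W`, the functions `quad_form` and `min_eigval`, and the shift heuristic are hypothetical stand-ins for the structured dual SDP operator in the paper, which would instead be applied through the network's layers.

```python
import jax
import jax.numpy as jnp

# Hypothetical toy operator: in the paper's setting, the symmetric dual
# matrix is only ever touched through matrix-vector products implemented
# as forward/backward passes through the network. Here a dense symmetric
# matrix M_sym stands in for that structured operator.
W = jax.random.normal(jax.random.PRNGKey(1), (64, 64))
M_sym = W + W.T  # used only to define the quadratic form below


def quad_form(v):
    # q(v) = 0.5 * v^T M v; differentiating q yields M @ v without
    # materializing M inside the solver loop.
    return 0.5 * v @ (M_sym @ v)


matvec = jax.grad(quad_form)  # matvec(v) == M_sym @ v, one "backward pass"


def min_eigval(matvec, dim, shift, num_iters=200, seed=0):
    """Estimate lambda_min(M) by power iteration on (shift * I - M).

    If shift >= lambda_max(M), the dominant eigenvector of the shifted
    operator is the bottom eigenvector of M, so the Rayleigh quotient
    recovers lambda_min. Memory use is a handful of length-dim vectors.
    """
    v = jax.random.normal(jax.random.PRNGKey(seed), (dim,))
    v = v / jnp.linalg.norm(v)
    for _ in range(num_iters):
        w = shift * v - matvec(v)  # apply (shift*I - M) matrix-free
        v = w / jnp.linalg.norm(w)
    return v @ matvec(v)  # Rayleigh quotient approximates lambda_min


# A crude upper bound on lambda_max (the Frobenius norm) serves as the shift.
shift = jnp.linalg.norm(M_sym, ord="fro")
print(min_eigval(matvec, dim=64, shift=shift))
```

Because every operation above is a JAX primitive, the same loop runs unmodified on hardware optimized for deep learning, and since the matrix is never formed, memory stays linear in the vector dimension, mirroring the linear-in-activations memory claim in the abstract. In a practical solver, Lanczos-style methods would typically converge faster than plain power iteration; this sketch favors brevity.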