

Poster

Provable Editing of Deep Neural Networks using Parametric Linear Relaxation

Zhe Tao · Aditya V Thakur

East Exhibit Hall A-C #4411
Fri 13 Dec 11 a.m. PST — 2 p.m. PST

Abstract: Ensuring that a DNN satisfies a desired property is critical when deploying DNNs in safety-critical applications. There are efficient methods that can verify whether a DNN satisfies a property, as seen in the annual DNN verification competition (VNN-COMP). However, the problem of provably editing a DNN to satisfy a property remains challenging. We present PREPARED, the first efficient technique for provable editing of DNNs. Given a DNN $\mathcal{N}$ with parameters $\theta$, input polytope $P$, and output polytope $Q$, PREPARED finds new parameters $\theta'$ such that $\forall \mathrm{x} \in P .\ \mathcal{N}(\mathrm{x}; \theta') \in Q$ while minimizing the changes $\lVert{\theta' - \theta}\rVert$. Given a DNN and a property it violates from the VNN-COMP benchmarks, PREPARED is able to provably edit the DNN to satisfy this property within 45 seconds. PREPARED is efficient because it relaxes the NP-hard provable editing problem to solving a linear program. The key contribution is the novel notion of Parametric Linear Relaxation, which enables PREPARED to construct tight output bounds of the DNN that are parameterized by the new parameters $\theta'$. We demonstrate that PREPARED is more efficient and effective compared to prior DNN editing approaches i) using the VNN-COMP benchmarks, ii) by editing CIFAR10 and TinyImageNet image-recognition DNNs, and BERT sentiment-classification DNNs for local robustness, and iii) by training a DNN to model a geodynamics process and satisfy physics constraints.
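To make the problem statement concrete, here is a minimal, hypothetical sketch (not the PREPARED algorithm or its Parametric Linear Relaxation) of provable editing reduced to a linear program in the simplest possible setting: a single affine output neuron $y = \mathrm{w} \cdot \mathrm{x} + b$, a box input region, and an upper-bound output property $y \le q$. In this one-layer case the reduction is exact, since $\max_{\mathrm{x} \in [l,u]} \mathrm{w} \cdot \mathrm{x} = \mathrm{w} \cdot c + \lvert \mathrm{w} \rvert \cdot r$ with box center $c$ and radius $r$, and the absolute values can be linearized with auxiliary variables. The function name `edit_affine_neuron` and the use of SciPy's LP solver are illustrative choices, not part of the paper.

```python
# Illustrative sketch (NOT the PREPARED implementation): edit the weights of a
# single affine neuron y = w.x + b so that y <= q for every x in the box [l, u],
# minimizing the L-infinity change to (w, b), via one linear program.
import numpy as np
from scipy.optimize import linprog

def edit_affine_neuron(w, b, l, u, q):
    """Return (w', b') with minimal L-inf change such that w'.x + b' <= q on [l, u]."""
    n = len(w)
    c_box = (l + u) / 2.0          # box center
    r_box = (u - l) / 2.0          # box radius (non-negative)
    # Variable layout: [w' (n), b' (1), t (n) with t >= |w'|, delta (1)]
    nv = 2 * n + 2
    iw, ib, it, idelta = slice(0, n), n, slice(n + 1, 2 * n + 1), 2 * n + 1

    cost = np.zeros(nv); cost[idelta] = 1.0   # minimize delta = ||(w', b') - (w, b)||_inf
    A, rhs = [], []

    # Soundness: w'.c + sum_j t_j * r_j + b' <= q  implies  w'.x + b' <= q on [l, u]
    row = np.zeros(nv); row[iw] = c_box; row[it] = r_box; row[ib] = 1.0
    A.append(row); rhs.append(q)

    for j in range(n):
        # t_j >= |w'_j|  (two one-sided constraints)
        row = np.zeros(nv); row[j] = 1.0;  row[n + 1 + j] = -1.0; A.append(row); rhs.append(0.0)
        row = np.zeros(nv); row[j] = -1.0; row[n + 1 + j] = -1.0; A.append(row); rhs.append(0.0)
        # |w'_j - w_j| <= delta
        row = np.zeros(nv); row[j] = 1.0;  row[idelta] = -1.0; A.append(row); rhs.append(w[j])
        row = np.zeros(nv); row[j] = -1.0; row[idelta] = -1.0; A.append(row); rhs.append(-w[j])
    # |b' - b| <= delta
    row = np.zeros(nv); row[ib] = 1.0;  row[idelta] = -1.0; A.append(row); rhs.append(b)
    row = np.zeros(nv); row[ib] = -1.0; row[idelta] = -1.0; A.append(row); rhs.append(-b)

    bounds = [(None, None)] * (n + 1) + [(0, None)] * (n + 1)   # t >= 0, delta >= 0
    res = linprog(cost, A_ub=np.array(A), b_ub=np.array(rhs), bounds=bounds, method="highs")
    assert res.success
    return res.x[iw], res.x[ib]

# Example: force the output below 0.5 over the box [-1, 1]^3.
w0, b0 = np.array([1.0, -2.0, 0.5]), 0.3
w1, b1 = edit_affine_neuron(w0, b0, l=np.full(3, -1.0), u=np.full(3, 1.0), q=0.5)
print(w1, b1)  # edited parameters; the worst-case output over the box is now <= 0.5
```

For deeper networks with nonlinear activations the output is no longer affine in the edited parameters, which is where a parametric relaxation of the bounds in terms of $\theta'$ becomes necessary; the sketch above only illustrates the constraint-and-minimal-change structure of the problem.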
