Poster
Achieving Domain-Independent Certified Robustness via Knowledge Continuity
Alan Sun · Chiyu Ma · Kenneth Ge · Soroush Vosoughi
We present knowledge continuity, a novel definition inspired by Lipschitz continuity that aims to certify the robustness of neural networks across input domains (e.g., the continuous and discrete domains of vision and language, respectively). Most existing approaches to certified robustness, Lipschitz continuity in particular, are restricted to continuous domains and come with norm- and distribution-dependent guarantees. In contrast, our proposed definition yields certification guarantees that depend only on the loss function and the intermediate learned metric spaces of the neural network; these bounds are independent of domain modality, norms, and distribution. We further demonstrate that the expressiveness of a model class is not at odds with its knowledge continuity, implying that achieving robustness by maximizing knowledge continuity should not, in theory, hinder inferential performance. Finally, we present several applications of knowledge continuity, such as regularization, and show that knowledge continuity can also localize vulnerable components of a neural network.
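To make the Lipschitz analogy concrete, one illustrative way to write such a condition is sketched below. The notation (the constant K, the k-th representation map h_k, and the metric d_k) is our assumption for exposition; the abstract only states that the guarantees depend on the loss function and the intermediate learned metric spaces.

```latex
% Illustrative Lipschitz-style condition (notation assumed, not from the abstract):
% a model f with loss \ell could be called K-knowledge continuous with respect to
% its k-th intermediate representation h_k and metric d_k if, for all labeled
% inputs (x, y) and (x', y'),
\[
  \bigl| \ell(f(x), y) - \ell(f(x'), y') \bigr|
  \;\le\; K \cdot d_k\bigl( h_k(x),\, h_k(x') \bigr).
\]
% Unlike classical Lipschitz continuity, the distance is measured in a learned
% representation space rather than with a fixed norm on the raw input domain,
% which is what makes the condition meaningful for discrete inputs such as text.
```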
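The abstract names regularization as one application. As a loose sketch only (the paper's actual objective is not given here; the function name, the pairwise estimator, and the use of a max over batch ratios are all our assumptions), a trainer could penalize large per-example loss differences relative to distances in an intermediate representation:

```python
import torch

def knowledge_continuity_penalty(losses: torch.Tensor,
                                 hidden: torch.Tensor,
                                 eps: float = 1e-8) -> torch.Tensor:
    """Hedged sketch of a knowledge-continuity regularizer.

    losses: shape (B,), per-example loss values for a batch
    hidden: shape (B, D), intermediate-layer representations of the same batch
    Returns the largest empirical Lipschitz-style ratio over batch pairs.
    """
    # Pairwise absolute differences of per-example losses, shape (B, B).
    loss_diff = (losses[:, None] - losses[None, :]).abs()
    # Pairwise Euclidean distances in the learned representation space;
    # eps keeps the diagonal (and near-duplicate pairs) from dividing by zero.
    rep_dist = torch.cdist(hidden, hidden) + eps
    # Empirical ratios |l_i - l_j| / d(h_i, h_j); the diagonal is 0 / eps = 0.
    ratios = loss_diff / rep_dist
    return ratios.max()

# Usage sketch: add the penalty to the task loss with a weight `lam`.
# total_loss = task_loss + lam * knowledge_continuity_penalty(per_example_losses, h_k)
```

Driving this penalty down encourages the loss to vary smoothly over the model's own representation geometry, which matches the abstract's claim that the certification depends only on the loss and the intermediate metric spaces, not on an input-space norm.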