Date of Award

Spring 6-9-2024

Document Type

Thesis (Undergraduate)

Department

Computer Science

First Advisor

Soroush Vosoughi

Abstract

We present knowledge continuity, a novel definition inspired by Lipschitz continuity that aims to certify the robustness of neural networks across input domains (such as the continuous and discrete domains of vision and language, respectively). Most existing approaches to certifying robustness, especially those based on Lipschitz continuity, are confined to the continuous domain and provide norm- and distribution-dependent guarantees. In contrast, our proposed definition yields certification guarantees that depend only on the loss function and the intermediate learned metric spaces of the neural network; these bounds are independent of domain modality, norms, and distribution. We further demonstrate that the expressiveness of a model class is not at odds with its knowledge continuity, which implies that achieving robustness by maximizing knowledge continuity should not, in theory, hinder inferential performance. Finally, we present several applications of knowledge continuity, such as regularization, and show that it can also localize vulnerable components of a neural network.
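
For readers unfamiliar with the classical notion the definition generalizes: a function f between metric spaces (X, d_X) and (Y, d_Y) is K-Lipschitz continuous when

\[
d_Y\bigl(f(x), f(x')\bigr) \;\le\; K \, d_X(x, x') \qquad \text{for all } x, x' \in X.
\]

Informally, and as a reading of this abstract rather than the thesis's formal statement, knowledge continuity appears to replace the fixed input metric d_X with the network's intermediate learned metric spaces and to measure output perturbations through the loss function; this is what frees the resulting certificates from dependence on any particular norm or input distribution.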

Available for download on Friday, May 30, 2025
