Evaluation under Real-world Distribution Shifts

  • Kumail Alhamoud

Student thesis: Master's Thesis

Abstract

Recent advancements in empirical and certified robustness have shown promising results in developing reliable and deployable Deep Neural Networks (DNNs). However, most evaluations of DNN robustness have focused on testing models on images from the same distribution they were trained on. In real-world scenarios, DNNs may encounter dynamic environments with significant distribution shifts. This thesis aims to investigate the interplay between empirical and certified adversarial robustness and domain generalization. We take the first step by training robust models on multiple domains and evaluating their accuracy and robustness on an unseen domain. Our findings reveal that: (1) both empirical and certified robustness exhibit generalization to unseen domains, and (2) the level of generalizability does not correlate strongly with the visual similarity of inputs, as measured by the Fréchet Inception Distance (FID) between source and target domains. Furthermore, we extend our study to a real-world medical application, where we demonstrate that adversarial augmentation significantly enhances robustness generalization while minimally affecting accuracy on clean data. This research sheds light on the importance of evaluating DNNs under real-world distribution shifts and highlights the potential of adversarial augmentation in improving robustness in practical applications.
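The abstract measures visual similarity between source and target domains with the Fréchet Inception Distance. As a rough sketch of that metric (not the thesis's own code): FID models each domain's Inception-network activations as a multivariate Gaussian and computes the Fréchet distance between the two Gaussians. The feature extraction step is assumed to have happened already; the function below only takes the resulting feature arrays.

```python
import numpy as np
from scipy.linalg import sqrtm

def frechet_inception_distance(feat_a, feat_b):
    """Fréchet distance between two Gaussians fitted to feature sets.

    feat_a, feat_b: arrays of shape (n_samples, feat_dim), assumed to be
    activations from a pretrained Inception network (not computed here).
    FID = ||mu_a - mu_b||^2 + Tr(C_a + C_b - 2 * (C_a C_b)^{1/2})
    """
    mu_a, mu_b = feat_a.mean(axis=0), feat_b.mean(axis=0)
    cov_a = np.cov(feat_a, rowvar=False)
    cov_b = np.cov(feat_b, rowvar=False)
    # Matrix square root of the covariance product; numerical error can
    # introduce tiny imaginary components, which we discard.
    covmean = sqrtm(cov_a @ cov_b)
    if np.iscomplexobj(covmean):
        covmean = covmean.real
    return float(
        np.sum((mu_a - mu_b) ** 2)
        + np.trace(cov_a + cov_b - 2.0 * covmean)
    )
```

A lower FID indicates visually closer domains; the abstract's point (2) is that this closeness does not strongly predict how well robustness transfers.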
Date of Award: Jul 2023
Original language: English (US)
Awarding Institution
  • Computer, Electrical and Mathematical Sciences and Engineering
Supervisor: Bernard Ghanem

Keywords

  • machine learning
  • computer vision
  • medical imaging
  • robustness
