This is an implementation demo of the ICLR 2021 paper [Neural Attention Distillation: Erasing Backdoor Triggers from Deep Neural Networks](https://openreview.net/pdf?id=9l0K4OM-oXE) in PyTorch.
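At its core, NAD fine-tunes a backdoored student network so that its intermediate attention maps align with those of a finetuned teacher. Below is a minimal sketch of such an attention-distillation loss; the function names and the exact normalization steps (channel-wise power, spatial flattening, L2 normalization) are assumptions in the spirit of attention transfer, not the authors' exact code:

```python
import torch
import torch.nn.functional as F

def attention_map(fm: torch.Tensor, p: int = 2) -> torch.Tensor:
    """Collapse a (B, C, H, W) feature map into a normalized spatial attention map.

    Each spatial location is scored by the mean of |activation|^p over channels,
    then the flattened map is L2-normalized per sample.
    """
    am = fm.abs().pow(p).mean(dim=1).flatten(1)  # (B, H*W)
    return F.normalize(am, dim=1)

def nad_attention_loss(student_fm: torch.Tensor,
                       teacher_fm: torch.Tensor,
                       p: int = 2) -> torch.Tensor:
    """Distance between student and teacher attention maps at one layer.

    During NAD-style fine-tuning this term (summed over several layers and
    weighted by a hyperparameter) is added to the ordinary cross-entropy loss.
    """
    return (attention_map(student_fm, p) - attention_map(teacher_fm, p)).pow(2).mean()
```

In use, the teacher's feature maps would be computed under `torch.no_grad()` from hooked intermediate layers, and the loss above would be summed across the chosen residual groups; see the repository code for the authors' actual implementation.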