about the loss in the pgd_attack
lith0613 opened this issue · comments
The CW loss on line 36 of pgd_attack.py uses a negative sign, but there is no such sign in the original CW loss. Looking forward to your help.
loss = -tf.nn.relu(correct_logit - wrong_logit + 50)
The original CW attack is stated as a minimization problem. Since our PGD attack maximizes the loss, we negate the CW loss to accommodate that.
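To illustrate the sign flip, here is a minimal dependency-free sketch (the function names are made up for illustration; the margin of 50 matches the constant in the snippet above). Maximizing the negated loss is exactly the same optimization as minimizing the original CW loss:

```python
def cw_loss_min(correct_logit, wrong_logit, kappa=50.0):
    # Original CW formulation: relu(correct - wrong + kappa),
    # a quantity the attacker MINIMIZES (drive the correct logit
    # below the strongest wrong logit by margin kappa).
    return max(correct_logit - wrong_logit + kappa, 0.0)

def cw_loss_max(correct_logit, wrong_logit, kappa=50.0):
    # Negated version, as on line 36 of pgd_attack.py:
    # -relu(correct - wrong + kappa), a quantity the PGD loop
    # MAXIMIZES via gradient ascent.
    return -cw_loss_min(correct_logit, wrong_logit, kappa)

# The two losses always differ only by sign, so ascent on one
# equals descent on the other.
print(cw_loss_min(10.0, 0.0))   # 60.0
print(cw_loss_max(10.0, 0.0))   # -60.0
print(cw_loss_min(0.0, 100.0))  # 0.0 (margin already satisfied)
```

Once the correct logit is more than kappa below the strongest wrong logit, the loss saturates at 0 (or -0 in the negated form), so PGD stops pushing that example further.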