YyzHarry / ME-Net

[ICML 2019] ME-Net: Towards Effective Adversarial Robustness with Matrix Estimation

Home Page: http://me-net.csail.mit.edu


mask_train_cnt is almost always 1

MorningstarZhang opened this issue · comments

mask_train_cnt = math.ceil((batch_idx + 1) / (50000/batch_size))

Why is there a math.ceil? That seems to make args.startp + mask_train_cnt*(args.endp-args.startp)/args.mask_num always equal to args.endp.
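For concreteness, here is a minimal sketch of why the expression collapses to args.endp (the batch_size/startp/endp values below are illustrative, not necessarily the defaults in train_adv.py):

```python
import math

batch_size = 256                      # illustrative value
startp, endp, mask_num = 0.4, 0.6, 1  # illustrative values; mask_num = 1 as in the provided setting

num_batches = math.ceil(50000 / batch_size)  # ~196 batches per epoch on CIFAR-10
for batch_idx in range(num_batches):
    # (batch_idx + 1) / (50000 / batch_size) stays in (0, 1] for all but the last batch,
    # so math.ceil rounds it up to 1 almost every time.
    mask_train_cnt = math.ceil((batch_idx + 1) / (50000 / batch_size))
    p = startp + mask_train_cnt * (endp - startp) / mask_num
    # With mask_train_cnt == 1 and mask_num == 1, p == startp + (endp - startp) == endp.
```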

Or maybe I should change mask_num to 10, as in train_pure?

Also, I would appreciate training details for adversarial training on MNIST and SVHN.

Hi, I changed line 409 to remove the math.ceil, and the model I trained gets 79.1% accuracy on clean data and 64.4% accuracy under the PGD-7 white-box attack, which is slightly lower than the checkpoint you provided (85.5/67.3). I don't know why that happened.
The problem can't be the modification I made, because the p used in training is always 0.6 in the currently provided train_adv.py. I wonder how many images are sent to the model in each epoch? The args.mask_num is 1 in your setting, which seems to make it useless; however, the function get_data still has a loop to concatenate data. In fact, my best model was obtained after epoch 71, and adjusting the learning rate after epoch 100 doesn't help. As shown in Table 27, more masks improve performance on both clean and adversarial data, so I wonder if args.mask_num should be 10?
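For intuition about what a larger mask_num changes in that get_data loop, here is a minimal hypothetical sketch (apply_random_mask and get_masked_data are illustrative names, not the repo's actual functions):

```python
import numpy as np

def apply_random_mask(images, p):
    # ME-Net-style masking sketch: keep each pixel independently with probability p.
    mask = (np.random.rand(*images.shape) < p).astype(images.dtype)
    return images * mask

def get_masked_data(images, startp, endp, mask_num):
    # Hypothetical stand-in for the get_data loop discussed above: one masked
    # copy of the data per mask, with p spread across (startp, endp], all
    # concatenated into a single (mask_num x larger) training set.
    copies = []
    for i in range(1, mask_num + 1):
        p = startp + i * (endp - startp) / mask_num
        copies.append(apply_random_mask(images, p))
    return np.concatenate(copies, axis=0)

# With mask_num = 1 the loop runs once and p = endp; with mask_num = 10 the
# model sees ten masked copies per epoch, with p ranging over (startp, endp].
```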

Yes, you can change mask_num. Larger values should bring better results; the default is chosen for computational efficiency.