mr-eggplant / SAR

Code for ICLR 2023 paper (Oral) — Towards Stable Test-Time Adaptation in Dynamic Wild World


Experiment results of SAR are lower than reported in the paper

Cccanvas opened this issue · comments

Hi, thanks for your great work.
I cloned the code and tried to reproduce the results in Table 2. However, the results of SAR are lower than reported in the paper (the results of TENT match the paper).

I ran the experiment on the 4 blur corruptions (defocus_blur, glass_blur, motion_blur, zoom_blur).

The commands I ran are:
python main.py --data_corruption /home/cz/data/imagenet/ --exp_type label_shifts --method tent --model vitbase_timm --output ./outputs/tent
python main.py --data_corruption /home/cz/data/imagenet/ --exp_type label_shifts --method sar --model vitbase_timm --output ./outputs/sar

The results of TENT are:

2023-07-24 01:19:06,132 INFO : this exp is for label shifts, no need to shuffle the dataloader, use our pre-defined sample order
2023-07-24 01:19:06,393 INFO : imbalance ratio is 500000
2023-07-24 01:19:06,394 INFO : label_shifts_indices_path is ./dataset/total_100000_ir_500000_class_order_shuffle_yes.npy
2023-07-24 01:19:18,428 INFO : Namespace(corruption='defocus_blur', d_margin=0.05, data='/dockerdata/imagenet', data_corruption='/home/cz/data/imagenet/', debug=False, e_margin=2.763102111592855, exp_type='label_shifts', fisher_alpha=2000.0, fisher_size=2000, gpu=0, if_shuffle=False, imbalance_ratio=500000, level=5, logger_name='2023-07-24-01-19-06-tent-vitbase_timm-level5-seed2021.txt', lr=0.001, method='tent', model='vitbase_timm', output='./outputs/tent', print_freq=39, sar_margin_e0=2.763102111592855, seed=2021, test_batch_size=64, workers=2)
2023-07-24 01:19:18,432 INFO : ['blocks.0.norm1.weight', 'blocks.0.norm1.bias', 'blocks.0.norm2.weight', 'blocks.0.norm2.bias', 'blocks.1.norm1.weight', 'blocks.1.norm1.bias', 'blocks.1.norm2.weight', 'blocks.1.norm2.bias', 'blocks.2.norm1.weight', 'blocks.2.norm1.bias', 'blocks.2.norm2.weight', 'blocks.2.norm2.bias', 'blocks.3.norm1.weight', 'blocks.3.norm1.bias', 'blocks.3.norm2.weight', 'blocks.3.norm2.bias', 'blocks.4.norm1.weight', 'blocks.4.norm1.bias', 'blocks.4.norm2.weight', 'blocks.4.norm2.bias', 'blocks.5.norm1.weight', 'blocks.5.norm1.bias', 'blocks.5.norm2.weight', 'blocks.5.norm2.bias', 'blocks.6.norm1.weight', 'blocks.6.norm1.bias', 'blocks.6.norm2.weight', 'blocks.6.norm2.bias', 'blocks.7.norm1.weight', 'blocks.7.norm1.bias', 'blocks.7.norm2.weight', 'blocks.7.norm2.bias', 'blocks.8.norm1.weight', 'blocks.8.norm1.bias', 'blocks.8.norm2.weight', 'blocks.8.norm2.bias', 'blocks.9.norm1.weight', 'blocks.9.norm1.bias', 'blocks.9.norm2.weight', 'blocks.9.norm2.bias', 'blocks.10.norm1.weight', 'blocks.10.norm1.bias', 'blocks.10.norm2.weight', 'blocks.10.norm2.bias', 'blocks.11.norm1.weight', 'blocks.11.norm1.bias', 'blocks.11.norm2.weight', 'blocks.11.norm2.bias', 'norm.weight', 'norm.bias']
2023-07-24 01:39:25,591 INFO : Result under defocus_blur. The adapttion accuracy of Tent is top1 54.37700 and top5: 77.98100
2023-07-24 01:39:25,592 INFO : acc1s are [54.37699890136719]
2023-07-24 01:39:25,592 INFO : acc5s are [77.98099517822266]
2023-07-24 01:39:25,846 INFO : imbalance ratio is 500000
2023-07-24 01:39:25,846 INFO : label_shifts_indices_path is ./dataset/total_100000_ir_500000_class_order_shuffle_yes.npy
2023-07-24 01:39:43,015 INFO : Namespace(corruption='glass_blur', d_margin=0.05, data='/dockerdata/imagenet', data_corruption='/home/cz/data/imagenet/', debug=False, e_margin=2.763102111592855, exp_type='label_shifts', fisher_alpha=2000.0, fisher_size=2000, gpu=0, if_shuffle=False, imbalance_ratio=500000, level=5, logger_name='2023-07-24-01-19-06-tent-vitbase_timm-level5-seed2021.txt', lr=0.001, method='tent', model='vitbase_timm', output='./outputs/tent', print_freq=39, sar_margin_e0=2.763102111592855, seed=2021, test_batch_size=64, workers=2)
2023-07-24 01:39:43,019 INFO : ['blocks.0.norm1.weight', 'blocks.0.norm1.bias', 'blocks.0.norm2.weight', 'blocks.0.norm2.bias', 'blocks.1.norm1.weight', 'blocks.1.norm1.bias', 'blocks.1.norm2.weight', 'blocks.1.norm2.bias', 'blocks.2.norm1.weight', 'blocks.2.norm1.bias', 'blocks.2.norm2.weight', 'blocks.2.norm2.bias', 'blocks.3.norm1.weight', 'blocks.3.norm1.bias', 'blocks.3.norm2.weight', 'blocks.3.norm2.bias', 'blocks.4.norm1.weight', 'blocks.4.norm1.bias', 'blocks.4.norm2.weight', 'blocks.4.norm2.bias', 'blocks.5.norm1.weight', 'blocks.5.norm1.bias', 'blocks.5.norm2.weight', 'blocks.5.norm2.bias', 'blocks.6.norm1.weight', 'blocks.6.norm1.bias', 'blocks.6.norm2.weight', 'blocks.6.norm2.bias', 'blocks.7.norm1.weight', 'blocks.7.norm1.bias', 'blocks.7.norm2.weight', 'blocks.7.norm2.bias', 'blocks.8.norm1.weight', 'blocks.8.norm1.bias', 'blocks.8.norm2.weight', 'blocks.8.norm2.bias', 'blocks.9.norm1.weight', 'blocks.9.norm1.bias', 'blocks.9.norm2.weight', 'blocks.9.norm2.bias', 'blocks.10.norm1.weight', 'blocks.10.norm1.bias', 'blocks.10.norm2.weight', 'blocks.10.norm2.bias', 'blocks.11.norm1.weight', 'blocks.11.norm1.bias', 'blocks.11.norm2.weight', 'blocks.11.norm2.bias', 'norm.weight', 'norm.bias']
2023-07-24 01:59:49,538 INFO : Result under glass_blur. The adapttion accuracy of Tent is top1 52.10900 and top5: 75.50600
2023-07-24 01:59:49,538 INFO : acc1s are [54.37699890136719, 52.1089973449707]
2023-07-24 01:59:49,539 INFO : acc5s are [77.98099517822266, 75.50599670410156]
2023-07-24 01:59:49,785 INFO : imbalance ratio is 500000
2023-07-24 01:59:49,785 INFO : label_shifts_indices_path is ./dataset/total_100000_ir_500000_class_order_shuffle_yes.npy
2023-07-24 01:59:55,613 INFO : Namespace(corruption='motion_blur', d_margin=0.05, data='/dockerdata/imagenet', data_corruption='/home/cz/data/imagenet/', debug=False, e_margin=2.763102111592855, exp_type='label_shifts', fisher_alpha=2000.0, fisher_size=2000, gpu=0, if_shuffle=False, imbalance_ratio=500000, level=5, logger_name='2023-07-24-01-19-06-tent-vitbase_timm-level5-seed2021.txt', lr=0.001, method='tent', model='vitbase_timm', output='./outputs/tent', print_freq=39, sar_margin_e0=2.763102111592855, seed=2021, test_batch_size=64, workers=2)
2023-07-24 01:59:55,617 INFO : ['blocks.0.norm1.weight', 'blocks.0.norm1.bias', 'blocks.0.norm2.weight', 'blocks.0.norm2.bias', 'blocks.1.norm1.weight', 'blocks.1.norm1.bias', 'blocks.1.norm2.weight', 'blocks.1.norm2.bias', 'blocks.2.norm1.weight', 'blocks.2.norm1.bias', 'blocks.2.norm2.weight', 'blocks.2.norm2.bias', 'blocks.3.norm1.weight', 'blocks.3.norm1.bias', 'blocks.3.norm2.weight', 'blocks.3.norm2.bias', 'blocks.4.norm1.weight', 'blocks.4.norm1.bias', 'blocks.4.norm2.weight', 'blocks.4.norm2.bias', 'blocks.5.norm1.weight', 'blocks.5.norm1.bias', 'blocks.5.norm2.weight', 'blocks.5.norm2.bias', 'blocks.6.norm1.weight', 'blocks.6.norm1.bias', 'blocks.6.norm2.weight', 'blocks.6.norm2.bias', 'blocks.7.norm1.weight', 'blocks.7.norm1.bias', 'blocks.7.norm2.weight', 'blocks.7.norm2.bias', 'blocks.8.norm1.weight', 'blocks.8.norm1.bias', 'blocks.8.norm2.weight', 'blocks.8.norm2.bias', 'blocks.9.norm1.weight', 'blocks.9.norm1.bias', 'blocks.9.norm2.weight', 'blocks.9.norm2.bias', 'blocks.10.norm1.weight', 'blocks.10.norm1.bias', 'blocks.10.norm2.weight', 'blocks.10.norm2.bias', 'blocks.11.norm1.weight', 'blocks.11.norm1.bias', 'blocks.11.norm2.weight', 'blocks.11.norm2.bias', 'norm.weight', 'norm.bias']
2023-07-24 02:20:02,500 INFO : Result under motion_blur. The adapttion accuracy of Tent is top1 58.14200 and top5: 80.62900
2023-07-24 02:20:02,500 INFO : acc1s are [54.37699890136719, 52.1089973449707, 58.141998291015625]
2023-07-24 02:20:02,500 INFO : acc5s are [77.98099517822266, 75.50599670410156, 80.62899780273438]
2023-07-24 02:20:02,756 INFO : imbalance ratio is 500000
2023-07-24 02:20:02,756 INFO : label_shifts_indices_path is ./dataset/total_100000_ir_500000_class_order_shuffle_yes.npy
2023-07-24 02:20:05,444 INFO : Namespace(corruption='zoom_blur', d_margin=0.05, data='/dockerdata/imagenet', data_corruption='/home/cz/data/imagenet/', debug=False, e_margin=2.763102111592855, exp_type='label_shifts', fisher_alpha=2000.0, fisher_size=2000, gpu=0, if_shuffle=False, imbalance_ratio=500000, level=5, logger_name='2023-07-24-01-19-06-tent-vitbase_timm-level5-seed2021.txt', lr=0.001, method='tent', model='vitbase_timm', output='./outputs/tent', print_freq=39, sar_margin_e0=2.763102111592855, seed=2021, test_batch_size=64, workers=2)
2023-07-24 02:20:05,448 INFO : ['blocks.0.norm1.weight', 'blocks.0.norm1.bias', 'blocks.0.norm2.weight', 'blocks.0.norm2.bias', 'blocks.1.norm1.weight', 'blocks.1.norm1.bias', 'blocks.1.norm2.weight', 'blocks.1.norm2.bias', 'blocks.2.norm1.weight', 'blocks.2.norm1.bias', 'blocks.2.norm2.weight', 'blocks.2.norm2.bias', 'blocks.3.norm1.weight', 'blocks.3.norm1.bias', 'blocks.3.norm2.weight', 'blocks.3.norm2.bias', 'blocks.4.norm1.weight', 'blocks.4.norm1.bias', 'blocks.4.norm2.weight', 'blocks.4.norm2.bias', 'blocks.5.norm1.weight', 'blocks.5.norm1.bias', 'blocks.5.norm2.weight', 'blocks.5.norm2.bias', 'blocks.6.norm1.weight', 'blocks.6.norm1.bias', 'blocks.6.norm2.weight', 'blocks.6.norm2.bias', 'blocks.7.norm1.weight', 'blocks.7.norm1.bias', 'blocks.7.norm2.weight', 'blocks.7.norm2.bias', 'blocks.8.norm1.weight', 'blocks.8.norm1.bias', 'blocks.8.norm2.weight', 'blocks.8.norm2.bias', 'blocks.9.norm1.weight', 'blocks.9.norm1.bias', 'blocks.9.norm2.weight', 'blocks.9.norm2.bias', 'blocks.10.norm1.weight', 'blocks.10.norm1.bias', 'blocks.10.norm2.weight', 'blocks.10.norm2.bias', 'blocks.11.norm1.weight', 'blocks.11.norm1.bias', 'blocks.11.norm2.weight', 'blocks.11.norm2.bias', 'norm.weight', 'norm.bias']
2023-07-24 02:40:12,053 INFO : Result under zoom_blur. The adapttion accuracy of Tent is top1 52.10100 and top5: 75.84200
2023-07-24 02:40:12,053 INFO : acc1s are [54.37699890136719, 52.1089973449707, 58.141998291015625, 52.10099792480469]
2023-07-24 02:40:12,053 INFO : acc5s are [77.98099517822266, 75.50599670410156, 80.62899780273438, 75.84199523925781]

The results of SAR are:

2023-07-24 01:22:36,198 INFO : this exp is for label shifts, no need to shuffle the dataloader, use our pre-defined sample order
2023-07-24 01:22:36,478 INFO : imbalance ratio is 500000
2023-07-24 01:22:36,478 INFO : label_shifts_indices_path is ./dataset/total_100000_ir_500000_class_order_shuffle_yes.npy
2023-07-24 01:22:54,412 INFO : Namespace(corruption='defocus_blur', d_margin=0.05, data='/dockerdata/imagenet', data_corruption='/home/cz/data/imagenet/', debug=False, e_margin=2.763102111592855, exp_type='label_shifts', fisher_alpha=2000.0, fisher_size=2000, gpu=0, if_shuffle=False, imbalance_ratio=500000, level=5, logger_name='2023-07-24-01-22-36-sar-vitbase_timm-level5-seed2021.txt', lr=0.001, method='sar', model='vitbase_timm', output='./outputs/sar', print_freq=39, sar_margin_e0=2.763102111592855, seed=2021, test_batch_size=64, workers=2)
2023-07-24 01:22:54,416 INFO : ['blocks.0.norm1.weight', 'blocks.0.norm1.bias', 'blocks.0.norm2.weight', 'blocks.0.norm2.bias', 'blocks.1.norm1.weight', 'blocks.1.norm1.bias', 'blocks.1.norm2.weight', 'blocks.1.norm2.bias', 'blocks.2.norm1.weight', 'blocks.2.norm1.bias', 'blocks.2.norm2.weight', 'blocks.2.norm2.bias', 'blocks.3.norm1.weight', 'blocks.3.norm1.bias', 'blocks.3.norm2.weight', 'blocks.3.norm2.bias', 'blocks.4.norm1.weight', 'blocks.4.norm1.bias', 'blocks.4.norm2.weight', 'blocks.4.norm2.bias', 'blocks.5.norm1.weight', 'blocks.5.norm1.bias', 'blocks.5.norm2.weight', 'blocks.5.norm2.bias', 'blocks.6.norm1.weight', 'blocks.6.norm1.bias', 'blocks.6.norm2.weight', 'blocks.6.norm2.bias', 'blocks.7.norm1.weight', 'blocks.7.norm1.bias', 'blocks.7.norm2.weight', 'blocks.7.norm2.bias', 'blocks.8.norm1.weight', 'blocks.8.norm1.bias', 'blocks.8.norm2.weight', 'blocks.8.norm2.bias']
2023-07-24 02:03:32,173 INFO : Result under defocus_blur. The adaptation accuracy of SAR is top1: 29.08600 and top5: 48.70900
2023-07-24 02:03:32,173 INFO : acc1s are [29.08599853515625]
2023-07-24 02:03:32,173 INFO : acc5s are [48.70899963378906]
2023-07-24 02:03:32,417 INFO : imbalance ratio is 500000
2023-07-24 02:03:32,417 INFO : label_shifts_indices_path is ./dataset/total_100000_ir_500000_class_order_shuffle_yes.npy
2023-07-24 02:03:40,059 INFO : Namespace(corruption='glass_blur', d_margin=0.05, data='/dockerdata/imagenet', data_corruption='/home/cz/data/imagenet/', debug=False, e_margin=2.763102111592855, exp_type='label_shifts', fisher_alpha=2000.0, fisher_size=2000, gpu=0, if_shuffle=False, imbalance_ratio=500000, level=5, logger_name='2023-07-24-01-22-36-sar-vitbase_timm-level5-seed2021.txt', lr=0.001, method='sar', model='vitbase_timm', output='./outputs/sar', print_freq=39, sar_margin_e0=2.763102111592855, seed=2021, test_batch_size=64, workers=2)
2023-07-24 02:03:40,063 INFO : ['blocks.0.norm1.weight', 'blocks.0.norm1.bias', 'blocks.0.norm2.weight', 'blocks.0.norm2.bias', 'blocks.1.norm1.weight', 'blocks.1.norm1.bias', 'blocks.1.norm2.weight', 'blocks.1.norm2.bias', 'blocks.2.norm1.weight', 'blocks.2.norm1.bias', 'blocks.2.norm2.weight', 'blocks.2.norm2.bias', 'blocks.3.norm1.weight', 'blocks.3.norm1.bias', 'blocks.3.norm2.weight', 'blocks.3.norm2.bias', 'blocks.4.norm1.weight', 'blocks.4.norm1.bias', 'blocks.4.norm2.weight', 'blocks.4.norm2.bias', 'blocks.5.norm1.weight', 'blocks.5.norm1.bias', 'blocks.5.norm2.weight', 'blocks.5.norm2.bias', 'blocks.6.norm1.weight', 'blocks.6.norm1.bias', 'blocks.6.norm2.weight', 'blocks.6.norm2.bias', 'blocks.7.norm1.weight', 'blocks.7.norm1.bias', 'blocks.7.norm2.weight', 'blocks.7.norm2.bias', 'blocks.8.norm1.weight', 'blocks.8.norm1.bias', 'blocks.8.norm2.weight', 'blocks.8.norm2.bias']
2023-07-24 02:44:17,940 INFO : Result under glass_blur. The adaptation accuracy of SAR is top1: 23.36000 and top5: 41.38300
2023-07-24 02:44:17,941 INFO : acc1s are [29.08599853515625, 23.35999870300293]
2023-07-24 02:44:17,941 INFO : acc5s are [48.70899963378906, 41.382999420166016]
2023-07-24 02:44:18,186 INFO : imbalance ratio is 500000
2023-07-24 02:44:18,186 INFO : label_shifts_indices_path is ./dataset/total_100000_ir_500000_class_order_shuffle_yes.npy
2023-07-24 02:44:24,259 INFO : Namespace(corruption='motion_blur', d_margin=0.05, data='/dockerdata/imagenet', data_corruption='/home/cz/data/imagenet/', debug=False, e_margin=2.763102111592855, exp_type='label_shifts', fisher_alpha=2000.0, fisher_size=2000, gpu=0, if_shuffle=False, imbalance_ratio=500000, level=5, logger_name='2023-07-24-01-22-36-sar-vitbase_timm-level5-seed2021.txt', lr=0.001, method='sar', model='vitbase_timm', output='./outputs/sar', print_freq=39, sar_margin_e0=2.763102111592855, seed=2021, test_batch_size=64, workers=2)
2023-07-24 02:44:24,263 INFO : ['blocks.0.norm1.weight', 'blocks.0.norm1.bias', 'blocks.0.norm2.weight', 'blocks.0.norm2.bias', 'blocks.1.norm1.weight', 'blocks.1.norm1.bias', 'blocks.1.norm2.weight', 'blocks.1.norm2.bias', 'blocks.2.norm1.weight', 'blocks.2.norm1.bias', 'blocks.2.norm2.weight', 'blocks.2.norm2.bias', 'blocks.3.norm1.weight', 'blocks.3.norm1.bias', 'blocks.3.norm2.weight', 'blocks.3.norm2.bias', 'blocks.4.norm1.weight', 'blocks.4.norm1.bias', 'blocks.4.norm2.weight', 'blocks.4.norm2.bias', 'blocks.5.norm1.weight', 'blocks.5.norm1.bias', 'blocks.5.norm2.weight', 'blocks.5.norm2.bias', 'blocks.6.norm1.weight', 'blocks.6.norm1.bias', 'blocks.6.norm2.weight', 'blocks.6.norm2.bias', 'blocks.7.norm1.weight', 'blocks.7.norm1.bias', 'blocks.7.norm2.weight', 'blocks.7.norm2.bias', 'blocks.8.norm1.weight', 'blocks.8.norm1.bias', 'blocks.8.norm2.weight', 'blocks.8.norm2.bias']
2023-07-24 03:25:00,744 INFO : Result under motion_blur. The adaptation accuracy of SAR is top1: 33.95400 and top5: 54.65100
2023-07-24 03:25:00,744 INFO : acc1s are [29.08599853515625, 23.35999870300293, 33.95399856567383]
2023-07-24 03:25:00,745 INFO : acc5s are [48.70899963378906, 41.382999420166016, 54.650997161865234]
2023-07-24 03:25:00,988 INFO : imbalance ratio is 500000
2023-07-24 03:25:00,988 INFO : label_shifts_indices_path is ./dataset/total_100000_ir_500000_class_order_shuffle_yes.npy
2023-07-24 03:25:03,661 INFO : Namespace(corruption='zoom_blur', d_margin=0.05, data='/dockerdata/imagenet', data_corruption='/home/cz/data/imagenet/', debug=False, e_margin=2.763102111592855, exp_type='label_shifts', fisher_alpha=2000.0, fisher_size=2000, gpu=0, if_shuffle=False, imbalance_ratio=500000, level=5, logger_name='2023-07-24-01-22-36-sar-vitbase_timm-level5-seed2021.txt', lr=0.001, method='sar', model='vitbase_timm', output='./outputs/sar', print_freq=39, sar_margin_e0=2.763102111592855, seed=2021, test_batch_size=64, workers=2)
2023-07-24 03:25:03,665 INFO : ['blocks.0.norm1.weight', 'blocks.0.norm1.bias', 'blocks.0.norm2.weight', 'blocks.0.norm2.bias', 'blocks.1.norm1.weight', 'blocks.1.norm1.bias', 'blocks.1.norm2.weight', 'blocks.1.norm2.bias', 'blocks.2.norm1.weight', 'blocks.2.norm1.bias', 'blocks.2.norm2.weight', 'blocks.2.norm2.bias', 'blocks.3.norm1.weight', 'blocks.3.norm1.bias', 'blocks.3.norm2.weight', 'blocks.3.norm2.bias', 'blocks.4.norm1.weight', 'blocks.4.norm1.bias', 'blocks.4.norm2.weight', 'blocks.4.norm2.bias', 'blocks.5.norm1.weight', 'blocks.5.norm1.bias', 'blocks.5.norm2.weight', 'blocks.5.norm2.bias', 'blocks.6.norm1.weight', 'blocks.6.norm1.bias', 'blocks.6.norm2.weight', 'blocks.6.norm2.bias', 'blocks.7.norm1.weight', 'blocks.7.norm1.bias', 'blocks.7.norm2.weight', 'blocks.7.norm2.bias', 'blocks.8.norm1.weight', 'blocks.8.norm1.bias', 'blocks.8.norm2.weight', 'blocks.8.norm2.bias']
2023-07-24 04:05:41,512 INFO : Result under zoom_blur. The adaptation accuracy of SAR is top1: 27.04700 and top5: 46.20700
2023-07-24 04:05:41,513 INFO : acc1s are [29.08599853515625, 23.35999870300293, 33.95399856567383, 27.046998977661133]
2023-07-24 04:05:41,513 INFO : acc5s are [48.70899963378906, 41.382999420166016, 54.650997161865234, 46.207000732421875]

SAR only gets top-1 accuracies of [29.1, 23.4, 34.0, 27.0] on the 4 blur corruptions.
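For reference, the rounded per-corruption top-1 numbers follow directly from the acc1s list printed in the SAR log above:

```python
# Top-1 accuracies copied from the final "acc1s are [...]" line of the SAR log,
# rounded to one decimal for the summary.
acc1s = [29.08599853515625, 23.35999870300293, 33.95399856567383, 27.046998977661133]
rounded = [round(a, 1) for a in acc1s]
print(rounded)  # [29.1, 23.4, 34.0, 27.0]
```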

Did I miss something? Looking forward to your reply.

Did you modify the code? I just re-cloned the code and ran your command; the results below are the same as those in the paper:

python main.py --data_corruption ///dataset/imagenet-c --exp_type label_shifts --method sar --model vitbase_timm --output ./outputs/sar
2023-07-24 09:51:03,819 INFO : this exp is for label shifts, no need to shuffle the dataloader, use our pre-defined sample order
Test on gaussian_noise level 5
2023-07-24 09:51:03,983 INFO : imbalance ratio is 500000
2023-07-24 09:51:03,983 INFO : label_shifts_indices_path is ./dataset/total_100000_ir_500000_class_order_shuffle_yes.npy
2023-07-24 09:51:06,896 INFO : Namespace(corruption='gaussian_noise', d_margin=0.05, data='/dockerdata/imagenet', data_corruption='/mnt/inspurfs/dataset/imagenet-c', debug=False, e_margin=2.763102111592855, exp_type='label_shifts', fisher_alpha=2000.0, fisher_size=2000, gpu=0, if_shuffle=False, imbalance_ratio=500000, level=5, logger_name='2023-07-24-09-51-03-sar-vitbase_timm-level5-seed2021.txt', lr=0.001, method='sar', model='vitbase_timm', output='./outputs/sar', print_freq=39, sar_margin_e0=2.763102111592855, seed=2021, test_batch_size=64, workers=2)
2023-07-24 09:51:06,898 INFO : ['blocks.0.norm1.weight', 'blocks.0.norm1.bias', 'blocks.0.norm2.weight', 'blocks.0.norm2.bias', 'blocks.1.norm1.weight', 'blocks.1.norm1.bias', 'blocks.1.norm2.weight', 'blocks.1.norm2.bias', 'blocks.2.norm1.weight', 'blocks.2.norm1.bias', 'blocks.2.norm2.weight', 'blocks.2.norm2.bias', 'blocks.3.norm1.weight', 'blocks.3.norm1.bias', 'blocks.3.norm2.weight', 'blocks.3.norm2.bias', 'blocks.4.norm1.weight', 'blocks.4.norm1.bias', 'blocks.4.norm2.weight', 'blocks.4.norm2.bias', 'blocks.5.norm1.weight', 'blocks.5.norm1.bias', 'blocks.5.norm2.weight', 'blocks.5.norm2.bias', 'blocks.6.norm1.weight', 'blocks.6.norm1.bias', 'blocks.6.norm2.weight', 'blocks.6.norm2.bias', 'blocks.7.norm1.weight', 'blocks.7.norm1.bias', 'blocks.7.norm2.weight', 'blocks.7.norm2.bias', 'blocks.8.norm1.weight', 'blocks.8.norm1.bias', 'blocks.8.norm2.weight', 'blocks.8.norm2.bias']
Test: [ 0/1563] Time 2.589 ( 2.589) Acc@1 7.81 ( 7.81) Acc@5 17.19 ( 17.19)
Test: [ 39/1563] Time 0.601 ( 0.672) Acc@1 18.75 ( 25.82) Acc@5 46.88 ( 44.65)
Test: [ 78/1563] Time 0.601 ( 0.637) Acc@1 56.25 ( 26.98) Acc@5 78.12 ( 47.43)
Test: [ 117/1563] Time 0.601 ( 0.625) Acc@1 21.88 ( 31.86) Acc@5 53.12 ( 53.14)
Test: [ 156/1563] Time 0.588 ( 0.619) Acc@1 26.56 ( 35.02) Acc@5 59.38 ( 57.36)
Test: [ 195/1563] Time 0.600 ( 0.615) Acc@1 35.94 ( 35.32) Acc@5 56.25 ( 57.85)
Test: [ 234/1563] Time 0.603 ( 0.613) Acc@1 37.50 ( 37.69) Acc@5 57.81 ( 60.33)
Test: [ 273/1563] Time 0.602 ( 0.612) Acc@1 29.69 ( 37.21) Acc@5 64.06 ( 61.15)
Test: [ 312/1563] Time 0.590 ( 0.611) Acc@1 50.00 ( 38.12) Acc@5 59.38 ( 62.38)
Test: [ 351/1563] Time 0.603 ( 0.610) Acc@1 57.81 ( 39.02) Acc@5 78.12 ( 63.22)
Test: [ 390/1563] Time 0.603 ( 0.609) Acc@1 15.62 ( 39.63) Acc@5 48.44 ( 64.10)
Test: [ 429/1563] Time 0.603 ( 0.609) Acc@1 56.25 ( 40.51) Acc@5 85.94 ( 64.91)
Test: [ 468/1563] Time 0.603 ( 0.608) Acc@1 64.06 ( 41.24) Acc@5 82.81 ( 65.64)
Test: [ 507/1563] Time 0.604 ( 0.608) Acc@1 75.00 ( 42.05) Acc@5 90.62 ( 66.19)
Test: [ 546/1563] Time 0.604 ( 0.608) Acc@1 56.25 ( 42.98) Acc@5 79.69 ( 66.88)
Test: [ 585/1563] Time 0.604 ( 0.607) Acc@1 35.94 ( 43.46) Acc@5 60.94 ( 67.39)
Test: [ 624/1563] Time 0.604 ( 0.607) Acc@1 43.75 ( 43.96) Acc@5 76.56 ( 67.80)
Test: [ 663/1563] Time 0.604 ( 0.607) Acc@1 59.38 ( 44.65) Acc@5 75.00 ( 68.31)
Test: [ 702/1563] Time 0.603 ( 0.607) Acc@1 70.31 ( 45.37) Acc@5 81.25 ( 68.88)
Test: [ 741/1563] Time 0.604 ( 0.607) Acc@1 51.56 ( 45.82) Acc@5 75.00 ( 69.27)
Test: [ 780/1563] Time 0.604 ( 0.606) Acc@1 54.69 ( 45.98) Acc@5 84.38 ( 69.54)
Test: [ 819/1563] Time 0.604 ( 0.606) Acc@1 84.38 ( 46.30) Acc@5 93.75 ( 69.94)
Test: [ 858/1563] Time 0.604 ( 0.606) Acc@1 67.19 ( 46.68) Acc@5 79.69 ( 70.35)
Test: [ 897/1563] Time 0.603 ( 0.606) Acc@1 70.31 ( 47.04) Acc@5 87.50 ( 70.63)
Test: [ 936/1563] Time 0.593 ( 0.606) Acc@1 50.00 ( 47.36) Acc@5 81.25 ( 70.94)
Test: [ 975/1563] Time 0.603 ( 0.606) Acc@1 45.31 ( 47.65) Acc@5 67.19 ( 71.25)
Test: [1014/1563] Time 0.604 ( 0.606) Acc@1 39.06 ( 47.89) Acc@5 95.31 ( 71.56)
Test: [1053/1563] Time 0.604 ( 0.606) Acc@1 54.69 ( 48.17) Acc@5 79.69 ( 71.96)
Test: [1092/1563] Time 0.604 ( 0.606) Acc@1 70.31 ( 48.38) Acc@5 87.50 ( 72.10)
Test: [1131/1563] Time 0.604 ( 0.606) Acc@1 46.88 ( 48.58) Acc@5 70.31 ( 72.28)
Test: [1170/1563] Time 0.604 ( 0.605) Acc@1 62.50 ( 48.82) Acc@5 82.81 ( 72.46)
Test: [1209/1563] Time 0.604 ( 0.605) Acc@1 26.56 ( 48.70) Acc@5 67.19 ( 72.54)
Test: [1248/1563] Time 0.604 ( 0.605) Acc@1 64.06 ( 48.84) Acc@5 90.62 ( 72.59)
Test: [1287/1563] Time 0.604 ( 0.605) Acc@1 50.00 ( 48.95) Acc@5 70.31 ( 72.68)
Test: [1326/1563] Time 0.594 ( 0.605) Acc@1 60.94 ( 49.24) Acc@5 68.75 ( 72.87)
Test: [1365/1563] Time 0.616 ( 0.605) Acc@1 25.00 ( 49.17) Acc@5 57.81 ( 72.87)
Test: [1404/1563] Time 0.603 ( 0.605) Acc@1 73.44 ( 49.43) Acc@5 90.62 ( 73.10)
Test: [1443/1563] Time 0.616 ( 0.605) Acc@1 51.56 ( 49.48) Acc@5 75.00 ( 73.25)
Test: [1482/1563] Time 0.604 ( 0.605) Acc@1 40.62 ( 49.69) Acc@5 65.62 ( 73.39)
Test: [1521/1563] Time 0.615 ( 0.605) Acc@1 45.31 ( 49.65) Acc@5 70.31 ( 73.43)
Test: [1560/1563] Time 0.603 ( 0.605) Acc@1 85.94 ( 49.71) Acc@5 89.06 ( 73.54)
2023-07-24 10:06:52,274 INFO : Result under gaussian_noise. The adaptation accuracy of SAR is top1: 49.68900 and top5: 73.53100
2023-07-24 10:06:52,275 INFO : acc1s are [49.68899917602539]
2023-07-24 10:06:52,275 INFO : acc5s are [73.53099822998047]
Test on shot_noise level 5
2023-07-24 10:06:52,444 INFO : imbalance ratio is 500000
2023-07-24 10:06:52,444 INFO : label_shifts_indices_path is ./dataset/total_100000_ir_500000_class_order_shuffle_yes.npy
2023-07-24 10:06:53,848 INFO : Namespace(corruption='shot_noise', d_margin=0.05, data='/dockerdata/imagenet', data_corruption='/mnt/inspurfs/dataset/imagenet-c', debug=False, e_margin=2.763102111592855, exp_type='label_shifts', fisher_alpha=2000.0, fisher_size=2000, gpu=0, if_shuffle=False, imbalance_ratio=500000, level=5, logger_name='2023-07-24-09-51-03-sar-vitbase_timm-level5-seed2021.txt', lr=0.001, method='sar', model='vitbase_timm', output='./outputs/sar', print_freq=39, sar_margin_e0=2.763102111592855, seed=2021, test_batch_size=64, workers=2)
2023-07-24 10:06:53,850 INFO : ['blocks.0.norm1.weight', 'blocks.0.norm1.bias', 'blocks.0.norm2.weight', 'blocks.0.norm2.bias', 'blocks.1.norm1.weight', 'blocks.1.norm1.bias', 'blocks.1.norm2.weight', 'blocks.1.norm2.bias', 'blocks.2.norm1.weight', 'blocks.2.norm1.bias', 'blocks.2.norm2.weight', 'blocks.2.norm2.bias', 'blocks.3.norm1.weight', 'blocks.3.norm1.bias', 'blocks.3.norm2.weight', 'blocks.3.norm2.bias', 'blocks.4.norm1.weight', 'blocks.4.norm1.bias', 'blocks.4.norm2.weight', 'blocks.4.norm2.bias', 'blocks.5.norm1.weight', 'blocks.5.norm1.bias', 'blocks.5.norm2.weight', 'blocks.5.norm2.bias', 'blocks.6.norm1.weight', 'blocks.6.norm1.bias', 'blocks.6.norm2.weight', 'blocks.6.norm2.bias', 'blocks.7.norm1.weight', 'blocks.7.norm1.bias', 'blocks.7.norm2.weight', 'blocks.7.norm2.bias', 'blocks.8.norm1.weight', 'blocks.8.norm1.bias', 'blocks.8.norm2.weight', 'blocks.8.norm2.bias']
Test: [ 0/1563] Time 0.712 ( 0.712) Acc@1 4.69 ( 4.69) Acc@5 12.50 ( 12.50)
Test: [ 39/1563] Time 0.591 ( 0.605) Acc@1 21.88 ( 21.52) Acc@5 56.25 ( 40.27)
Test: [ 78/1563] Time 0.615 ( 0.604) Acc@1 32.81 ( 24.33) Acc@5 64.06 ( 44.98)
Test: [ 117/1563] Time 0.591 ( 0.604) Acc@1 17.19 ( 28.69) Acc@5 42.19 ( 49.18)
Test: [ 156/1563] Time 0.615 ( 0.604) Acc@1 31.25 ( 31.81) Acc@5 53.12 ( 53.34)
Test: [ 195/1563] Time 0.603 ( 0.604) Acc@1 32.81 ( 32.83) Acc@5 59.38 ( 55.07)
Test: [ 234/1563] Time 0.616 ( 0.604) Acc@1 28.12 ( 35.22) Acc@5 64.06 ( 57.60)
Test: [ 273/1563] Time 0.603 ( 0.604) Acc@1 40.62 ( 34.73) Acc@5 70.31 ( 58.41)
Test: [ 312/1563] Time 0.604 ( 0.603) Acc@1 46.88 ( 35.98) Acc@5 53.12 ( 59.93)
Test: [ 351/1563] Time 0.604 ( 0.603) Acc@1 67.19 ( 37.18) Acc@5 81.25 ( 61.08)
Test: [ 390/1563] Time 0.604 ( 0.603) Acc@1 26.56 ( 38.12) Acc@5 51.56 ( 62.27)
Test: [ 429/1563] Time 0.602 ( 0.603) Acc@1 56.25 ( 39.11) Acc@5 84.38 ( 63.23)
Test: [ 468/1563] Time 0.604 ( 0.603) Acc@1 46.88 ( 39.61) Acc@5 75.00 ( 63.83)
Test: [ 507/1563] Time 0.603 ( 0.603) Acc@1 75.00 ( 40.48) Acc@5 85.94 ( 64.59)
Test: [ 546/1563] Time 0.604 ( 0.603) Acc@1 54.69 ( 41.40) Acc@5 65.62 ( 65.27)
Test: [ 585/1563] Time 0.603 ( 0.603) Acc@1 32.81 ( 41.92) Acc@5 71.88 ( 65.89)
Test: [ 624/1563] Time 0.604 ( 0.603) Acc@1 48.44 ( 42.42) Acc@5 73.44 ( 66.33)
Test: [ 663/1563] Time 0.603 ( 0.603) Acc@1 50.00 ( 43.02) Acc@5 81.25 ( 66.81)
Test: [ 702/1563] Time 0.601 ( 0.603) Acc@1 68.75 ( 43.83) Acc@5 92.19 ( 67.44)
Test: [ 741/1563] Time 0.603 ( 0.603) Acc@1 53.12 ( 44.27) Acc@5 71.88 ( 67.79)
Test: [ 780/1563] Time 0.595 ( 0.603) Acc@1 42.19 ( 44.52) Acc@5 78.12 ( 68.10)
Test: [ 819/1563] Time 0.604 ( 0.603) Acc@1 85.94 ( 44.99) Acc@5 92.19 ( 68.58)
Test: [ 858/1563] Time 0.590 ( 0.603) Acc@1 71.88 ( 45.45) Acc@5 85.94 ( 68.98)
Test: [ 897/1563] Time 0.614 ( 0.603) Acc@1 81.25 ( 45.78) Acc@5 92.19 ( 69.30)
Test: [ 936/1563] Time 0.590 ( 0.603) Acc@1 46.88 ( 46.22) Acc@5 73.44 ( 69.67)
Test: [ 975/1563] Time 0.616 ( 0.603) Acc@1 35.94 ( 46.64) Acc@5 60.94 ( 69.98)
Test: [1014/1563] Time 0.590 ( 0.603) Acc@1 39.06 ( 46.98) Acc@5 89.06 ( 70.36)
Test: [1053/1563] Time 0.616 ( 0.603) Acc@1 64.06 ( 47.29) Acc@5 89.06 ( 70.79)
Test: [1092/1563] Time 0.604 ( 0.603) Acc@1 65.62 ( 47.56) Acc@5 85.94 ( 70.92)
Test: [1131/1563] Time 0.616 ( 0.603) Acc@1 42.19 ( 47.81) Acc@5 64.06 ( 71.14)
Test: [1170/1563] Time 0.603 ( 0.603) Acc@1 60.94 ( 48.11) Acc@5 93.75 ( 71.35)
Test: [1209/1563] Time 0.615 ( 0.603) Acc@1 46.88 ( 48.01) Acc@5 64.06 ( 71.32)
Test: [1248/1563] Time 0.603 ( 0.603) Acc@1 60.94 ( 48.24) Acc@5 89.06 ( 71.45)
Test: [1287/1563] Time 0.615 ( 0.603) Acc@1 54.69 ( 48.34) Acc@5 76.56 ( 71.54)
Test: [1326/1563] Time 0.603 ( 0.603) Acc@1 50.00 ( 48.62) Acc@5 68.75 ( 71.71)
Test: [1365/1563] Time 0.615 ( 0.603) Acc@1 26.56 ( 48.63) Acc@5 62.50 ( 71.81)
Test: [1404/1563] Time 0.603 ( 0.603) Acc@1 62.50 ( 48.86) Acc@5 92.19 ( 72.03)
Test: [1443/1563] Time 0.604 ( 0.603) Acc@1 62.50 ( 48.94) Acc@5 84.38 ( 72.25)
Test: [1482/1563] Time 0.602 ( 0.603) Acc@1 40.62 ( 49.13) Acc@5 65.62 ( 72.40)
Test: [1521/1563] Time 0.602 ( 0.603) Acc@1 53.12 ( 49.09) Acc@5 76.56 ( 72.47)
Test: [1560/1563] Time 0.602 ( 0.603) Acc@1 84.38 ( 49.26) Acc@5 87.50 ( 72.62)
2023-07-24 10:22:36,382 INFO : Result under shot_noise. The adaptation accuracy of SAR is top1: 49.24500 and top5: 72.61400
2023-07-24 10:22:36,383 INFO : acc1s are [49.68899917602539, 49.244998931884766]
2023-07-24 10:22:36,383 INFO : acc5s are [73.53099822998047, 72.61399841308594]
Test on impulse_noise level 5
2023-07-24 10:22:36,610 INFO : imbalance ratio is 500000
2023-07-24 10:22:36,610 INFO : label_shifts_indices_path is ./dataset/total_100000_ir_500000_class_order_shuffle_yes.npy
2023-07-24 10:22:37,962 INFO : Namespace(corruption='impulse_noise', d_margin=0.05, data='/dockerdata/imagenet', data_corruption='/mnt/inspurfs/dataset/imagenet-c', debug=False, e_margin=2.763102111592855, exp_type='label_shifts', fisher_alpha=2000.0, fisher_size=2000, gpu=0, if_shuffle=False, imbalance_ratio=500000, level=5, logger_name='2023-07-24-09-51-03-sar-vitbase_timm-level5-seed2021.txt', lr=0.001, method='sar', model='vitbase_timm', output='./outputs/sar', print_freq=39, sar_margin_e0=2.763102111592855, seed=2021, test_batch_size=64, workers=2)
2023-07-24 10:22:37,964 INFO : ['blocks.0.norm1.weight', 'blocks.0.norm1.bias', 'blocks.0.norm2.weight', 'blocks.0.norm2.bias', 'blocks.1.norm1.weight', 'blocks.1.norm1.bias', 'blocks.1.norm2.weight', 'blocks.1.norm2.bias', 'blocks.2.norm1.weight', 'blocks.2.norm1.bias', 'blocks.2.norm2.weight', 'blocks.2.norm2.bias', 'blocks.3.norm1.weight', 'blocks.3.norm1.bias', 'blocks.3.norm2.weight', 'blocks.3.norm2.bias', 'blocks.4.norm1.weight', 'blocks.4.norm1.bias', 'blocks.4.norm2.weight', 'blocks.4.norm2.bias', 'blocks.5.norm1.weight', 'blocks.5.norm1.bias', 'blocks.5.norm2.weight', 'blocks.5.norm2.bias', 'blocks.6.norm1.weight', 'blocks.6.norm1.bias', 'blocks.6.norm2.weight', 'blocks.6.norm2.bias', 'blocks.7.norm1.weight', 'blocks.7.norm1.bias', 'blocks.7.norm2.weight', 'blocks.7.norm2.bias', 'blocks.8.norm1.weight', 'blocks.8.norm1.bias', 'blocks.8.norm2.weight', 'blocks.8.norm2.bias']
Test: [ 0/1563] Time 0.710 ( 0.710) Acc@1 10.94 ( 10.94) Acc@5 25.00 ( 25.00)
Test: [ 39/1563] Time 0.591 ( 0.605) Acc@1 9.38 ( 23.01) Acc@5 34.38 ( 41.45)
Test: [ 78/1563] Time 0.615 ( 0.604) Acc@1 43.75 ( 24.31) Acc@5 67.19 ( 44.86)
Test: [ 117/1563] Time 0.602 ( 0.604) Acc@1 7.81 ( 30.02) Acc@5 45.31 ( 50.97)
Test: [ 156/1563] Time 0.603 ( 0.604) Acc@1 28.12 ( 34.07) Acc@5 62.50 ( 56.20)
Test: [ 195/1563] Time 0.603 ( 0.604) Acc@1 39.06 ( 35.27) Acc@5 68.75 ( 57.80)
Test: [ 234/1563] Time 0.604 ( 0.604) Acc@1 28.12 ( 37.69) Acc@5 64.06 ( 60.66)
Test: [ 273/1563] Time 0.604 ( 0.604) Acc@1 40.62 ( 37.28) Acc@5 71.88 ( 61.45)
Test: [ 312/1563] Time 0.604 ( 0.604) Acc@1 40.62 ( 38.35) Acc@5 57.81 ( 62.70)
Test: [ 351/1563] Time 0.604 ( 0.604) Acc@1 57.81 ( 39.26) Acc@5 75.00 ( 63.53)
Test: [ 390/1563] Time 0.603 ( 0.604) Acc@1 23.44 ( 39.81) Acc@5 40.62 ( 64.29)
Test: [ 429/1563] Time 0.603 ( 0.604) Acc@1 62.50 ( 40.72) Acc@5 82.81 ( 65.27)
Test: [ 468/1563] Time 0.603 ( 0.604) Acc@1 57.81 ( 41.44) Acc@5 81.25 ( 66.00)
Test: [ 507/1563] Time 0.603 ( 0.603) Acc@1 73.44 ( 42.09) Acc@5 90.62 ( 66.54)
Test: [ 546/1563] Time 0.603 ( 0.603) Acc@1 71.88 ( 43.21) Acc@5 82.81 ( 67.35)
Test: [ 585/1563] Time 0.604 ( 0.603) Acc@1 37.50 ( 43.74) Acc@5 64.06 ( 67.94)
Test: [ 624/1563] Time 0.602 ( 0.603) Acc@1 40.62 ( 44.15) Acc@5 75.00 ( 68.32)
Test: [ 663/1563] Time 0.604 ( 0.603) Acc@1 59.38 ( 44.70) Acc@5 75.00 ( 68.75)
Test: [ 702/1563] Time 0.603 ( 0.603) Acc@1 67.19 ( 45.45) Acc@5 73.44 ( 69.31)
Test: [ 741/1563] Time 0.603 ( 0.603) Acc@1 45.31 ( 45.97) Acc@5 75.00 ( 69.66)
Test: [ 780/1563] Time 0.603 ( 0.603) Acc@1 59.38 ( 46.07) Acc@5 85.94 ( 69.88)
Test: [ 819/1563] Time 0.603 ( 0.603) Acc@1 84.38 ( 46.36) Acc@5 96.88 ( 70.32)
Test: [ 858/1563] Time 0.604 ( 0.603) Acc@1 73.44 ( 46.71) Acc@5 90.62 ( 70.76)
Test: [ 897/1563] Time 0.604 ( 0.603) Acc@1 67.19 ( 47.04) Acc@5 92.19 ( 71.05)
Test: [ 936/1563] Time 0.604 ( 0.603) Acc@1 46.88 ( 47.37) Acc@5 73.44 ( 71.27)
Test: [ 975/1563] Time 0.603 ( 0.603) Acc@1 34.38 ( 47.76) Acc@5 70.31 ( 71.56)
Test: [1014/1563] Time 0.603 ( 0.603) Acc@1 51.56 ( 48.12) Acc@5 84.38 ( 71.91)
Test: [1053/1563] Time 0.603 ( 0.603) Acc@1 67.19 ( 48.48) Acc@5 92.19 ( 72.28)
Test: [1092/1563] Time 0.604 ( 0.603) Acc@1 68.75 ( 48.68) Acc@5 87.50 ( 72.40)
Test: [1131/1563] Time 0.603 ( 0.603) Acc@1 29.69 ( 48.90) Acc@5 60.94 ( 72.57)
Test: [1170/1563] Time 0.603 ( 0.603) Acc@1 68.75 ( 49.17) Acc@5 92.19 ( 72.77)
Test: [1209/1563] Time 0.602 ( 0.603) Acc@1 48.44 ( 49.04) Acc@5 73.44 ( 72.78)
Test: [1248/1563] Time 0.603 ( 0.603) Acc@1 62.50 ( 49.29) Acc@5 93.75 ( 72.96)
Test: [1287/1563] Time 0.604 ( 0.603) Acc@1 48.44 ( 49.38) Acc@5 78.12 ( 73.06)
Test: [1326/1563] Time 0.602 ( 0.603) Acc@1 59.38 ( 49.66) Acc@5 68.75 ( 73.29)
Test: [1365/1563] Time 0.601 ( 0.603) Acc@1 32.81 ( 49.67) Acc@5 60.94 ( 73.36)
Test: [1404/1563] Time 0.603 ( 0.603) Acc@1 65.62 ( 49.91) Acc@5 90.62 ( 73.56)
Test: [1443/1563] Time 0.604 ( 0.603) Acc@1 54.69 ( 49.97) Acc@5 82.81 ( 73.73)
Test: [1482/1563] Time 0.603 ( 0.603) Acc@1 43.75 ( 50.19) Acc@5 64.06 ( 73.88)
Test: [1521/1563] Time 0.604 ( 0.603) Acc@1 50.00 ( 50.13) Acc@5 70.31 ( 73.93)
Test: [1560/1563] Time 0.601 ( 0.603) Acc@1 87.50 ( 50.22) Acc@5 92.19 ( 74.07)
2023-07-24 10:38:20,838 INFO : Result under impulse_noise. The adaptation accuracy of SAR is top1: 50.20800 and top5: 74.06600
2023-07-24 10:38:20,839 INFO : acc1s are [49.68899917602539, 49.244998931884766, 50.20800018310547]
2023-07-24 10:38:20,839 INFO : acc5s are [73.53099822998047, 72.61399841308594, 74.06600189208984]
Test on defocus_blur level 5
2023-07-24 10:38:21,076 INFO : imbalance ratio is 500000
2023-07-24 10:38:21,076 INFO : label_shifts_indices_path is ./dataset/total_100000_ir_500000_class_order_shuffle_yes.npy
2023-07-24 10:38:22,426 INFO : Namespace(corruption='defocus_blur', d_margin=0.05, data='/dockerdata/imagenet', data_corruption='/mnt/inspurfs/dataset/imagenet-c', debug=False, e_margin=2.763102111592855, exp_type='label_shifts', fisher_alpha=2000.0, fisher_size=2000, gpu=0, if_shuffle=False, imbalance_ratio=500000, level=5, logger_name='2023-07-24-09-51-03-sar-vitbase_timm-level5-seed2021.txt', lr=0.001, method='sar', model='vitbase_timm', output='./outputs/sar', print_freq=39, sar_margin_e0=2.763102111592855, seed=2021, test_batch_size=64, workers=2)
2023-07-24 10:38:22,427 INFO : ['blocks.0.norm1.weight', 'blocks.0.norm1.bias', 'blocks.0.norm2.weight', 'blocks.0.norm2.bias', 'blocks.1.norm1.weight', 'blocks.1.norm1.bias', 'blocks.1.norm2.weight', 'blocks.1.norm2.bias', 'blocks.2.norm1.weight', 'blocks.2.norm1.bias', 'blocks.2.norm2.weight', 'blocks.2.norm2.bias', 'blocks.3.norm1.weight', 'blocks.3.norm1.bias', 'blocks.3.norm2.weight', 'blocks.3.norm2.bias', 'blocks.4.norm1.weight', 'blocks.4.norm1.bias', 'blocks.4.norm2.weight', 'blocks.4.norm2.bias', 'blocks.5.norm1.weight', 'blocks.5.norm1.bias', 'blocks.5.norm2.weight', 'blocks.5.norm2.bias', 'blocks.6.norm1.weight', 'blocks.6.norm1.bias', 'blocks.6.norm2.weight', 'blocks.6.norm2.bias', 'blocks.7.norm1.weight', 'blocks.7.norm1.bias', 'blocks.7.norm2.weight', 'blocks.7.norm2.bias', 'blocks.8.norm1.weight', 'blocks.8.norm1.bias', 'blocks.8.norm2.weight', 'blocks.8.norm2.bias']
Test: [ 0/1563] Time 0.694 ( 0.694) Acc@1 26.56 ( 26.56) Acc@5 48.44 ( 48.44)
Test: [ 39/1563] Time 0.603 ( 0.604) Acc@1 9.38 ( 32.62) Acc@5 21.88 ( 54.14)
Test: [ 78/1563] Time 0.604 ( 0.604) Acc@1 42.19 ( 35.21) Acc@5 81.25 ( 58.72)
Test: [ 117/1563] Time 0.604 ( 0.604) Acc@1 46.88 ( 39.88) Acc@5 85.94 ( 63.52)
Test: [ 156/1563] Time 0.603 ( 0.604) Acc@1 26.56 ( 41.85) Acc@5 43.75 ( 66.61)
Test: [ 195/1563] Time 0.603 ( 0.604) Acc@1 34.38 ( 42.88) Acc@5 60.94 ( 67.59)
Test: [ 234/1563] Time 0.603 ( 0.604) Acc@1 68.75 ( 45.61) Acc@5 87.50 ( 69.99)
Test: [ 273/1563] Time 0.604 ( 0.604) Acc@1 53.12 ( 45.79) Acc@5 76.56 ( 70.60)
Test: [ 312/1563] Time 0.603 ( 0.604) Acc@1 34.38 ( 46.54) Acc@5 65.62 ( 71.30)
Test: [ 351/1563] Time 0.602 ( 0.604) Acc@1 64.06 ( 47.30) Acc@5 78.12 ( 71.94)
Test: [ 390/1563] Time 0.602 ( 0.603) Acc@1 39.06 ( 48.14) Acc@5 67.19 ( 72.95)
Test: [ 429/1563] Time 0.602 ( 0.603) Acc@1 67.19 ( 49.11) Acc@5 87.50 ( 73.66)
Test: [ 468/1563] Time 0.605 ( 0.603) Acc@1 60.94 ( 49.66) Acc@5 75.00 ( 74.06)
Test: [ 507/1563] Time 0.604 ( 0.603) Acc@1 82.81 ( 50.22) Acc@5 90.62 ( 74.33)
Test: [ 546/1563] Time 0.602 ( 0.603) Acc@1 59.38 ( 50.81) Acc@5 87.50 ( 74.76)
Test: [ 585/1563] Time 0.602 ( 0.603) Acc@1 31.25 ( 50.91) Acc@5 70.31 ( 74.97)
Test: [ 624/1563] Time 0.602 ( 0.603) Acc@1 57.81 ( 51.21) Acc@5 73.44 ( 75.32)
Test: [ 663/1563] Time 0.603 ( 0.603) Acc@1 48.44 ( 51.76) Acc@5 82.81 ( 75.48)
Test: [ 702/1563] Time 0.603 ( 0.603) Acc@1 65.62 ( 52.36) Acc@5 79.69 ( 76.03)
Test: [ 741/1563] Time 0.604 ( 0.603) Acc@1 34.38 ( 52.64) Acc@5 60.94 ( 76.19)
Test: [ 780/1563] Time 0.604 ( 0.603) Acc@1 78.12 ( 52.85) Acc@5 95.31 ( 76.37)
Test: [ 819/1563] Time 0.602 ( 0.603) Acc@1 79.69 ( 52.96) Acc@5 87.50 ( 76.45)
Test: [ 858/1563] Time 0.603 ( 0.603) Acc@1 75.00 ( 53.27) Acc@5 93.75 ( 76.70)
Test: [ 897/1563] Time 0.604 ( 0.603) Acc@1 67.19 ( 53.47) Acc@5 79.69 ( 76.86)
Test: [ 936/1563] Time 0.603 ( 0.603) Acc@1 57.81 ( 53.77) Acc@5 85.94 ( 77.11)
Test: [ 975/1563] Time 0.604 ( 0.603) Acc@1 40.62 ( 53.95) Acc@5 64.06 ( 77.32)
Test: [1014/1563] Time 0.603 ( 0.603) Acc@1 71.88 ( 54.09) Acc@5 96.88 ( 77.52)
Test: [1053/1563] Time 0.601 ( 0.603) Acc@1 70.31 ( 54.39) Acc@5 87.50 ( 77.86)
Test: [1092/1563] Time 0.604 ( 0.603) Acc@1 73.44 ( 54.48) Acc@5 82.81 ( 77.82)
Test: [1131/1563] Time 0.603 ( 0.603) Acc@1 53.12 ( 54.59) Acc@5 84.38 ( 77.94)
Test: [1170/1563] Time 0.603 ( 0.603) Acc@1 71.88 ( 54.77) Acc@5 92.19 ( 78.02)
Test: [1209/1563] Time 0.603 ( 0.603) Acc@1 26.56 ( 54.64) Acc@5 68.75 ( 78.06)
Test: [1248/1563] Time 0.604 ( 0.603) Acc@1 65.62 ( 54.64) Acc@5 87.50 ( 78.02)
Test: [1287/1563] Time 0.603 ( 0.603) Acc@1 60.94 ( 54.66) Acc@5 81.25 ( 78.05)
Test: [1326/1563] Time 0.604 ( 0.603) Acc@1 60.94 ( 54.83) Acc@5 75.00 ( 78.17)
Test: [1365/1563] Time 0.601 ( 0.603) Acc@1 37.50 ( 54.86) Acc@5 65.62 ( 78.20)
Test: [1404/1563] Time 0.604 ( 0.603) Acc@1 76.56 ( 55.06) Acc@5 95.31 ( 78.34)
Test: [1443/1563] Time 0.604 ( 0.603) Acc@1 51.56 ( 55.11) Acc@5 84.38 ( 78.44)
Test: [1482/1563] Time 0.602 ( 0.603) Acc@1 29.69 ( 55.22) Acc@5 56.25 ( 78.56)
Test: [1521/1563] Time 0.603 ( 0.603) Acc@1 62.50 ( 55.29) Acc@5 85.94 ( 78.62)
Test: [1560/1563] Time 0.615 ( 0.603) Acc@1 81.25 ( 55.42) Acc@5 82.81 ( 78.75)
2023-07-24 10:54:05,063 INFO : Result under defocus_blur. The adaptation accuracy of SAR is top1: 55.40400 and top5: 78.73900
2023-07-24 10:54:05,064 INFO : acc1s are [49.68899917602539, 49.244998931884766, 50.20800018310547, 55.40399932861328]
2023-07-24 10:54:05,064 INFO : acc5s are [73.53099822998047, 72.61399841308594, 74.06600189208984, 78.73899841308594]
Test on glass_blur level 5
2023-07-24 10:54:05,232 INFO : imbalance ratio is 500000
2023-07-24 10:54:05,232 INFO : label_shifts_indices_path is ./dataset/total_100000_ir_500000_class_order_shuffle_yes.npy
2023-07-24 10:54:06,583 INFO : Namespace(corruption='glass_blur', d_margin=0.05, data='/dockerdata/imagenet', data_corruption='/mnt/inspurfs/dataset/imagenet-c', debug=False, e_margin=2.763102111592855, exp_type='label_shifts', fisher_alpha=2000.0, fisher_size=2000, gpu=0, if_shuffle=False, imbalance_ratio=500000, level=5, logger_name='2023-07-24-09-51-03-sar-vitbase_timm-level5-seed2021.txt', lr=0.001, method='sar', model='vitbase_timm', output='./outputs/sar', print_freq=39, sar_margin_e0=2.763102111592855, seed=2021, test_batch_size=64, workers=2)
2023-07-24 10:54:06,585 INFO : ['blocks.0.norm1.weight', 'blocks.0.norm1.bias', 'blocks.0.norm2.weight', 'blocks.0.norm2.bias', 'blocks.1.norm1.weight', 'blocks.1.norm1.bias', 'blocks.1.norm2.weight', 'blocks.1.norm2.bias', 'blocks.2.norm1.weight', 'blocks.2.norm1.bias', 'blocks.2.norm2.weight', 'blocks.2.norm2.bias', 'blocks.3.norm1.weight', 'blocks.3.norm1.bias', 'blocks.3.norm2.weight', 'blocks.3.norm2.bias', 'blocks.4.norm1.weight', 'blocks.4.norm1.bias', 'blocks.4.norm2.weight', 'blocks.4.norm2.bias', 'blocks.5.norm1.weight', 'blocks.5.norm1.bias', 'blocks.5.norm2.weight', 'blocks.5.norm2.bias', 'blocks.6.norm1.weight', 'blocks.6.norm1.bias', 'blocks.6.norm2.weight', 'blocks.6.norm2.bias', 'blocks.7.norm1.weight', 'blocks.7.norm1.bias', 'blocks.7.norm2.weight', 'blocks.7.norm2.bias', 'blocks.8.norm1.weight', 'blocks.8.norm1.bias', 'blocks.8.norm2.weight', 'blocks.8.norm2.bias']
Test: [ 0/1563] Time 0.693 ( 0.693) Acc@1 18.75 ( 18.75) Acc@5 29.69 ( 29.69)
Test: [ 39/1563] Time 0.604 ( 0.606) Acc@1 3.12 ( 29.10) Acc@5 6.25 ( 46.45)
Test: [ 78/1563] Time 0.603 ( 0.605) Acc@1 34.38 ( 29.31) Acc@5 68.75 ( 49.78)
Test: [ 117/1563] Time 0.616 ( 0.604) Acc@1 65.62 ( 34.38) Acc@5 85.94 ( 55.67)
Test: [ 156/1563] Time 0.603 ( 0.604) Acc@1 18.75 ( 36.92) Acc@5 40.62 ( 59.23)
Test: [ 195/1563] Time 0.615 ( 0.604) Acc@1 25.00 ( 37.49) Acc@5 64.06 ( 60.39)
Test: [ 234/1563] Time 0.604 ( 0.604) Acc@1 43.75 ( 39.93) Acc@5 71.88 ( 63.13)
Test: [ 273/1563] Time 0.616 ( 0.604) Acc@1 54.69 ( 40.33) Acc@5 78.12 ( 64.10)
Test: [ 312/1563] Time 0.603 ( 0.604) Acc@1 35.94 ( 41.22) Acc@5 54.69 ( 65.30)
Test: [ 351/1563] Time 0.609 ( 0.604) Acc@1 50.00 ( 42.52) Acc@5 67.19 ( 66.28)
Test: [ 390/1563] Time 0.603 ( 0.604) Acc@1 28.12 ( 43.35) Acc@5 60.94 ( 67.51)
Test: [ 429/1563] Time 0.604 ( 0.603) Acc@1 56.25 ( 44.55) Acc@5 87.50 ( 68.46)
Test: [ 468/1563] Time 0.611 ( 0.604) Acc@1 59.38 ( 45.24) Acc@5 76.56 ( 69.13)
Test: [ 507/1563] Time 0.604 ( 0.604) Acc@1 82.81 ( 46.00) Acc@5 98.44 ( 69.70)
Test: [ 546/1563] Time 0.615 ( 0.604) Acc@1 51.56 ( 46.84) Acc@5 75.00 ( 70.35)
Test: [ 585/1563] Time 0.591 ( 0.603) Acc@1 42.19 ( 47.25) Acc@5 73.44 ( 70.85)
Test: [ 624/1563] Time 0.615 ( 0.603) Acc@1 50.00 ( 47.97) Acc@5 68.75 ( 71.49)
Test: [ 663/1563] Time 0.590 ( 0.603) Acc@1 53.12 ( 48.82) Acc@5 87.50 ( 72.00)
Test: [ 702/1563] Time 0.615 ( 0.603) Acc@1 67.19 ( 49.72) Acc@5 92.19 ( 72.70)
Test: [ 741/1563] Time 0.604 ( 0.603) Acc@1 39.06 ( 50.20) Acc@5 70.31 ( 73.09)
Test: [ 780/1563] Time 0.616 ( 0.603) Acc@1 67.19 ( 50.51) Acc@5 90.62 ( 73.44)
Test: [ 819/1563] Time 0.602 ( 0.603) Acc@1 82.81 ( 50.72) Acc@5 90.62 ( 73.73)
Test: [ 858/1563] Time 0.603 ( 0.603) Acc@1 68.75 ( 51.04) Acc@5 87.50 ( 74.04)
Test: [ 897/1563] Time 0.603 ( 0.603) Acc@1 71.88 ( 51.42) Acc@5 85.94 ( 74.36)
Test: [ 936/1563] Time 0.603 ( 0.603) Acc@1 57.81 ( 51.69) Acc@5 76.56 ( 74.61)
Test: [ 975/1563] Time 0.590 ( 0.603) Acc@1 56.25 ( 51.97) Acc@5 76.56 ( 74.91)
Test: [1014/1563] Time 0.604 ( 0.603) Acc@1 54.69 ( 52.18) Acc@5 82.81 ( 75.27)
Test: [1053/1563] Time 0.590 ( 0.603) Acc@1 82.81 ( 52.66) Acc@5 95.31 ( 75.68)
Test: [1092/1563] Time 0.604 ( 0.603) Acc@1 75.00 ( 52.79) Acc@5 89.06 ( 75.75)
Test: [1131/1563] Time 0.589 ( 0.603) Acc@1 51.56 ( 53.05) Acc@5 68.75 ( 75.97)
Test: [1170/1563] Time 0.590 ( 0.603) Acc@1 65.62 ( 53.33) Acc@5 90.62 ( 76.15)
Test: [1209/1563] Time 0.603 ( 0.603) Acc@1 28.12 ( 53.23) Acc@5 62.50 ( 76.20)
Test: [1248/1563] Time 0.590 ( 0.603) Acc@1 67.19 ( 53.41) Acc@5 84.38 ( 76.25)
Test: [1287/1563] Time 0.603 ( 0.603) Acc@1 45.31 ( 53.47) Acc@5 71.88 ( 76.35)
Test: [1326/1563] Time 0.591 ( 0.603) Acc@1 70.31 ( 53.70) Acc@5 79.69 ( 76.50)
Test: [1365/1563] Time 0.601 ( 0.603) Acc@1 31.25 ( 53.73) Acc@5 67.19 ( 76.62)
Test: [1404/1563] Time 0.601 ( 0.603) Acc@1 68.75 ( 53.85) Acc@5 95.31 ( 76.78)
Test: [1443/1563] Time 0.603 ( 0.603) Acc@1 51.56 ( 53.86) Acc@5 82.81 ( 76.89)
Test: [1482/1563] Time 0.603 ( 0.603) Acc@1 50.00 ( 54.04) Acc@5 79.69 ( 77.04)
Test: [1521/1563] Time 0.604 ( 0.603) Acc@1 51.56 ( 54.06) Acc@5 85.94 ( 77.13)
Test: [1560/1563] Time 0.601 ( 0.603) Acc@1 89.06 ( 54.23) Acc@5 92.19 ( 77.30)
2023-07-24 11:09:49,176 INFO : Result under glass_blur. The adaptation accuracy of SAR is top1: 54.21800 and top5: 77.30199
2023-07-24 11:09:49,176 INFO : acc1s are [49.68899917602539, 49.244998931884766, 50.20800018310547, 55.40399932861328, 54.21799850463867]
2023-07-24 11:09:49,176 INFO : acc5s are [73.53099822998047, 72.61399841308594, 74.06600189208984, 78.73899841308594, 77.30199432373047]
Test on motion_blur level 5
2023-07-24 11:09:49,411 INFO : imbalance ratio is 500000
2023-07-24 11:09:49,411 INFO : label_shifts_indices_path is ./dataset/total_100000_ir_500000_class_order_shuffle_yes.npy
2023-07-24 11:09:50,764 INFO : Namespace(corruption='motion_blur', d_margin=0.05, data='/dockerdata/imagenet', data_corruption='/mnt/inspurfs/dataset/imagenet-c', debug=False, e_margin=2.763102111592855, exp_type='label_shifts', fisher_alpha=2000.0, fisher_size=2000, gpu=0, if_shuffle=False, imbalance_ratio=500000, level=5, logger_name='2023-07-24-09-51-03-sar-vitbase_timm-level5-seed2021.txt', lr=0.001, method='sar', model='vitbase_timm', output='./outputs/sar', print_freq=39, sar_margin_e0=2.763102111592855, seed=2021, test_batch_size=64, workers=2)
2023-07-24 11:09:50,766 INFO : ['blocks.0.norm1.weight', 'blocks.0.norm1.bias', 'blocks.0.norm2.weight', 'blocks.0.norm2.bias', 'blocks.1.norm1.weight', 'blocks.1.norm1.bias', 'blocks.1.norm2.weight', 'blocks.1.norm2.bias', 'blocks.2.norm1.weight', 'blocks.2.norm1.bias', 'blocks.2.norm2.weight', 'blocks.2.norm2.bias', 'blocks.3.norm1.weight', 'blocks.3.norm1.bias', 'blocks.3.norm2.weight', 'blocks.3.norm2.bias', 'blocks.4.norm1.weight', 'blocks.4.norm1.bias', 'blocks.4.norm2.weight', 'blocks.4.norm2.bias', 'blocks.5.norm1.weight', 'blocks.5.norm1.bias', 'blocks.5.norm2.weight', 'blocks.5.norm2.bias', 'blocks.6.norm1.weight', 'blocks.6.norm1.bias', 'blocks.6.norm2.weight', 'blocks.6.norm2.bias', 'blocks.7.norm1.weight', 'blocks.7.norm1.bias', 'blocks.7.norm2.weight', 'blocks.7.norm2.bias', 'blocks.8.norm1.weight', 'blocks.8.norm1.bias', 'blocks.8.norm2.weight', 'blocks.8.norm2.bias']
Test: [ 0/1563] Time 0.695 ( 0.695) Acc@1 21.88 ( 21.88) Acc@5 39.06 ( 39.06)
Test: [ 39/1563] Time 0.615 ( 0.604) Acc@1 25.00 ( 38.83) Acc@5 51.56 ( 59.45)
Test: [ 78/1563] Time 0.601 ( 0.602) Acc@1 51.56 ( 38.43) Acc@5 73.44 ( 61.27)
Test: [ 117/1563] Time 0.616 ( 0.602) Acc@1 37.50 ( 43.10) Acc@5 67.19 ( 65.44)
Test: [ 156/1563] Time 0.603 ( 0.603) Acc@1 23.44 ( 46.24) Acc@5 64.06 ( 68.87)
Test: [ 195/1563] Time 0.616 ( 0.603) Acc@1 35.94 ( 46.28) Acc@5 78.12 ( 69.49)
Test: [ 234/1563] Time 0.604 ( 0.603) Acc@1 59.38 ( 49.26) Acc@5 85.94 ( 72.13)
Test: [ 273/1563] Time 0.615 ( 0.603) Acc@1 62.50 ( 49.09) Acc@5 76.56 ( 72.46)
Test: [ 312/1563] Time 0.601 ( 0.603) Acc@1 35.94 ( 49.70) Acc@5 64.06 ( 73.26)
Test: [ 351/1563] Time 0.615 ( 0.603) Acc@1 67.19 ( 50.32) Acc@5 78.12 ( 73.85)
Test: [ 390/1563] Time 0.605 ( 0.603) Acc@1 42.19 ( 51.20) Acc@5 64.06 ( 74.69)
Test: [ 429/1563] Time 0.604 ( 0.603) Acc@1 59.38 ( 52.05) Acc@5 92.19 ( 75.50)
Test: [ 468/1563] Time 0.603 ( 0.603) Acc@1 70.31 ( 52.72) Acc@5 84.38 ( 75.93)
Test: [ 507/1563] Time 0.614 ( 0.603) Acc@1 82.81 ( 53.24) Acc@5 90.62 ( 76.33)
Test: [ 546/1563] Time 0.589 ( 0.602) Acc@1 62.50 ( 54.04) Acc@5 90.62 ( 76.81)
Test: [ 585/1563] Time 0.614 ( 0.602) Acc@1 40.62 ( 54.13) Acc@5 68.75 ( 76.92)
Test: [ 624/1563] Time 0.590 ( 0.602) Acc@1 42.19 ( 54.54) Acc@5 64.06 ( 77.20)
Test: [ 663/1563] Time 0.603 ( 0.602) Acc@1 75.00 ( 55.14) Acc@5 92.19 ( 77.48)
Test: [ 702/1563] Time 0.590 ( 0.602) Acc@1 68.75 ( 55.76) Acc@5 87.50 ( 77.94)
Test: [ 741/1563] Time 0.616 ( 0.602) Acc@1 34.38 ( 56.09) Acc@5 73.44 ( 78.30)
Test: [ 780/1563] Time 0.603 ( 0.603) Acc@1 79.69 ( 56.17) Acc@5 100.00 ( 78.52)
Test: [ 819/1563] Time 0.614 ( 0.603) Acc@1 75.00 ( 56.36) Acc@5 98.44 ( 78.79)
Test: [ 858/1563] Time 0.603 ( 0.603) Acc@1 76.56 ( 56.56) Acc@5 89.06 ( 79.04)
Test: [ 897/1563] Time 0.616 ( 0.603) Acc@1 60.94 ( 56.87) Acc@5 90.62 ( 79.25)
Test: [ 936/1563] Time 0.604 ( 0.603) Acc@1 45.31 ( 57.21) Acc@5 81.25 ( 79.48)
Test: [ 975/1563] Time 0.615 ( 0.603) Acc@1 56.25 ( 57.44) Acc@5 67.19 ( 79.75)
Test: [1014/1563] Time 0.603 ( 0.603) Acc@1 76.56 ( 57.47) Acc@5 98.44 ( 79.89)
Test: [1053/1563] Time 0.615 ( 0.602) Acc@1 78.12 ( 57.74) Acc@5 87.50 ( 80.19)
Test: [1092/1563] Time 0.603 ( 0.602) Acc@1 71.88 ( 57.83) Acc@5 84.38 ( 80.23)
Test: [1131/1563] Time 0.615 ( 0.602) Acc@1 62.50 ( 58.02) Acc@5 84.38 ( 80.31)
Test: [1170/1563] Time 0.595 ( 0.603) Acc@1 65.62 ( 58.20) Acc@5 90.62 ( 80.44)
Test: [1209/1563] Time 0.604 ( 0.603) Acc@1 43.75 ( 58.11) Acc@5 75.00 ( 80.47)
Test: [1248/1563] Time 0.589 ( 0.603) Acc@1 68.75 ( 58.20) Acc@5 84.38 ( 80.49)
Test: [1287/1563] Time 0.614 ( 0.602) Acc@1 54.69 ( 58.27) Acc@5 82.81 ( 80.54)
Test: [1326/1563] Time 0.591 ( 0.602) Acc@1 64.06 ( 58.42) Acc@5 84.38 ( 80.62)
Test: [1365/1563] Time 0.616 ( 0.602) Acc@1 35.94 ( 58.42) Acc@5 75.00 ( 80.70)
Test: [1404/1563] Time 0.590 ( 0.602) Acc@1 62.50 ( 58.56) Acc@5 90.62 ( 80.84)
Test: [1443/1563] Time 0.603 ( 0.603) Acc@1 51.56 ( 58.64) Acc@5 89.06 ( 80.97)
Test: [1482/1563] Time 0.590 ( 0.602) Acc@1 53.12 ( 58.83) Acc@5 68.75 ( 81.11)
Test: [1521/1563] Time 0.615 ( 0.602) Acc@1 48.44 ( 58.81) Acc@5 76.56 ( 81.13)
Test: [1560/1563] Time 0.603 ( 0.603) Acc@1 90.62 ( 58.96) Acc@5 93.75 ( 81.29)
2023-07-24 11:25:32,345 INFO : Result under motion_blur. The adaptation accuracy of SAR is top1: 58.93800 and top5: 81.27800
2023-07-24 11:25:32,345 INFO : acc1s are [49.68899917602539, 49.244998931884766, 50.20800018310547, 55.40399932861328, 54.21799850463867, 58.9379997253418]
2023-07-24 11:25:32,345 INFO : acc5s are [73.53099822998047, 72.61399841308594, 74.06600189208984, 78.73899841308594, 77.30199432373047, 81.27799987792969]
Test on zoom_blur level 5
2023-07-24 11:25:32,522 INFO : imbalance ratio is 500000
2023-07-24 11:25:32,522 INFO : label_shifts_indices_path is ./dataset/total_100000_ir_500000_class_order_shuffle_yes.npy
2023-07-24 11:25:33,867 INFO : Namespace(corruption='zoom_blur', d_margin=0.05, data='/dockerdata/imagenet', data_corruption='/mnt/inspurfs/dataset/imagenet-c', debug=False, e_margin=2.763102111592855, exp_type='label_shifts', fisher_alpha=2000.0, fisher_size=2000, gpu=0, if_shuffle=False, imbalance_ratio=500000, level=5, logger_name='2023-07-24-09-51-03-sar-vitbase_timm-level5-seed2021.txt', lr=0.001, method='sar', model='vitbase_timm', output='./outputs/sar', print_freq=39, sar_margin_e0=2.763102111592855, seed=2021, test_batch_size=64, workers=2)
2023-07-24 11:25:33,869 INFO : ['blocks.0.norm1.weight', 'blocks.0.norm1.bias', 'blocks.0.norm2.weight', 'blocks.0.norm2.bias', 'blocks.1.norm1.weight', 'blocks.1.norm1.bias', 'blocks.1.norm2.weight', 'blocks.1.norm2.bias', 'blocks.2.norm1.weight', 'blocks.2.norm1.bias', 'blocks.2.norm2.weight', 'blocks.2.norm2.bias', 'blocks.3.norm1.weight', 'blocks.3.norm1.bias', 'blocks.3.norm2.weight', 'blocks.3.norm2.bias', 'blocks.4.norm1.weight', 'blocks.4.norm1.bias', 'blocks.4.norm2.weight', 'blocks.4.norm2.bias', 'blocks.5.norm1.weight', 'blocks.5.norm1.bias', 'blocks.5.norm2.weight', 'blocks.5.norm2.bias', 'blocks.6.norm1.weight', 'blocks.6.norm1.bias', 'blocks.6.norm2.weight', 'blocks.6.norm2.bias', 'blocks.7.norm1.weight', 'blocks.7.norm1.bias', 'blocks.7.norm2.weight', 'blocks.7.norm2.bias', 'blocks.8.norm1.weight', 'blocks.8.norm1.bias', 'blocks.8.norm2.weight', 'blocks.8.norm2.bias']
Test: [ 0/1563] Time 0.660 ( 0.660) Acc@1 17.19 ( 17.19) Acc@5 21.88 ( 21.88)
Test: [ 39/1563] Time 0.604 ( 0.605) Acc@1 20.31 ( 33.24) Acc@5 53.12 ( 52.27)
Test: [ 78/1563] Time 0.603 ( 0.604) Acc@1 32.81 ( 35.27) Acc@5 62.50 ( 56.25)
Test: [ 117/1563] Time 0.602 ( 0.604) Acc@1 34.38 ( 38.10) Acc@5 67.19 ( 60.28)
Test: [ 156/1563] Time 0.603 ( 0.603) Acc@1 54.69 ( 39.92) Acc@5 82.81 ( 63.48)
Test: [ 195/1563] Time 0.602 ( 0.603) Acc@1 12.50 ( 40.00) Acc@5 46.88 ( 64.11)
Test: [ 234/1563] Time 0.604 ( 0.603) Acc@1 64.06 ( 43.55) Acc@5 85.94 ( 67.03)
Test: [ 273/1563] Time 0.604 ( 0.603) Acc@1 59.38 ( 43.35) Acc@5 79.69 ( 67.31)
Test: [ 312/1563] Time 0.602 ( 0.603) Acc@1 45.31 ( 44.61) Acc@5 68.75 ( 68.54)
Test: [ 351/1563] Time 0.602 ( 0.603) Acc@1 54.69 ( 45.41) Acc@5 76.56 ( 69.14)
Test: [ 390/1563] Time 0.602 ( 0.603) Acc@1 37.50 ( 46.48) Acc@5 67.19 ( 70.25)
Test: [ 429/1563] Time 0.602 ( 0.603) Acc@1 60.94 ( 47.24) Acc@5 93.75 ( 71.34)
Test: [ 468/1563] Time 0.603 ( 0.603) Acc@1 53.12 ( 47.76) Acc@5 79.69 ( 71.98)
Test: [ 507/1563] Time 0.603 ( 0.603) Acc@1 82.81 ( 48.33) Acc@5 96.88 ( 72.32)
Test: [ 546/1563] Time 0.603 ( 0.603) Acc@1 73.44 ( 49.22) Acc@5 93.75 ( 73.01)
Test: [ 585/1563] Time 0.590 ( 0.603) Acc@1 29.69 ( 49.43) Acc@5 65.62 ( 73.39)
Test: [ 624/1563] Time 0.601 ( 0.603) Acc@1 51.56 ( 49.87) Acc@5 82.81 ( 73.67)
Test: [ 663/1563] Time 0.603 ( 0.603) Acc@1 60.94 ( 50.54) Acc@5 93.75 ( 74.00)
Test: [ 702/1563] Time 0.602 ( 0.603) Acc@1 73.44 ( 51.06) Acc@5 84.38 ( 74.35)
Test: [ 741/1563] Time 0.602 ( 0.603) Acc@1 40.62 ( 51.36) Acc@5 81.25 ( 74.52)
Test: [ 780/1563] Time 0.603 ( 0.603) Acc@1 75.00 ( 51.53) Acc@5 93.75 ( 74.69)
Test: [ 819/1563] Time 0.602 ( 0.603) Acc@1 75.00 ( 51.84) Acc@5 89.06 ( 75.16)
Test: [ 858/1563] Time 0.603 ( 0.603) Acc@1 78.12 ( 52.11) Acc@5 90.62 ( 75.41)
Test: [ 897/1563] Time 0.603 ( 0.603) Acc@1 65.62 ( 52.56) Acc@5 92.19 ( 75.83)
Test: [ 936/1563] Time 0.602 ( 0.603) Acc@1 43.75 ( 52.87) Acc@5 73.44 ( 76.11)
Test: [ 975/1563] Time 0.603 ( 0.603) Acc@1 46.88 ( 53.05) Acc@5 56.25 ( 76.39)
Test: [1014/1563] Time 0.604 ( 0.603) Acc@1 70.31 ( 53.23) Acc@5 93.75 ( 76.60)
Test: [1053/1563] Time 0.604 ( 0.603) Acc@1 75.00 ( 53.65) Acc@5 89.06 ( 76.98)
Test: [1092/1563] Time 0.604 ( 0.603) Acc@1 67.19 ( 53.82) Acc@5 79.69 ( 77.10)
Test: [1131/1563] Time 0.603 ( 0.603) Acc@1 62.50 ( 53.99) Acc@5 76.56 ( 77.26)
Test: [1170/1563] Time 0.604 ( 0.603) Acc@1 57.81 ( 54.31) Acc@5 82.81 ( 77.49)
Test: [1209/1563] Time 0.603 ( 0.603) Acc@1 40.62 ( 54.22) Acc@5 68.75 ( 77.61)
Test: [1248/1563] Time 0.604 ( 0.603) Acc@1 64.06 ( 54.33) Acc@5 79.69 ( 77.62)
Test: [1287/1563] Time 0.604 ( 0.603) Acc@1 76.56 ( 54.38) Acc@5 92.19 ( 77.70)
Test: [1326/1563] Time 0.601 ( 0.603) Acc@1 56.25 ( 54.65) Acc@5 73.44 ( 77.86)
Test: [1365/1563] Time 0.603 ( 0.603) Acc@1 17.19 ( 54.62) Acc@5 59.38 ( 77.92)
Test: [1404/1563] Time 0.603 ( 0.603) Acc@1 68.75 ( 54.78) Acc@5 93.75 ( 78.03)
Test: [1443/1563] Time 0.603 ( 0.603) Acc@1 43.75 ( 54.79) Acc@5 79.69 ( 78.17)
Test: [1482/1563] Time 0.602 ( 0.603) Acc@1 37.50 ( 54.99) Acc@5 64.06 ( 78.31)
Test: [1521/1563] Time 0.615 ( 0.603) Acc@1 64.06 ( 55.06) Acc@5 92.19 ( 78.38)
Test: [1560/1563] Time 0.603 ( 0.603) Acc@1 87.50 ( 55.25) Acc@5 90.62 ( 78.56)
2023-07-24 11:41:16,210 INFO : Result under zoom_blur. The adaptation accuracy of SAR is top1: 55.22300 and top5: 78.54600
2023-07-24 11:41:16,210 INFO : acc1s are [49.68899917602539, 49.244998931884766, 50.20800018310547, 55.40399932861328, 54.21799850463867, 58.9379997253418, 55.222999572753906]
2023-07-24 11:41:16,210 INFO : acc5s are [73.53099822998047, 72.61399841308594, 74.06600189208984, 78.73899841308594, 77.30199432373047, 81.27799987792969, 78.5459976196289]

Thanks for your reply. I found the problem: I had set episodic=True in my code during testing and forgot to turn it off, so the model was being reset after every batch instead of adapting continually.
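For anyone hitting the same gap: SAR (like Tent) is meant to adapt continually across the test stream, so an episodic reset after every batch discards all accumulated updates and the model effectively starts from the source weights each time. A toy sketch of the difference (the `TTAWrapper` class and its fields are hypothetical, not the repo's actual code):

```python
import copy

class TTAWrapper:
    """Toy stand-in for a Tent/SAR-style adapter (hypothetical, illustrative only)."""
    def __init__(self, params, episodic=False):
        self.params = dict(params)
        self.initial = copy.deepcopy(self.params)  # snapshot of source-model state
        self.episodic = episodic
        self.steps = 0  # number of adaptation updates currently accumulated

    def reset(self):
        # Restore the source-model parameters and forget all adaptation.
        self.params = copy.deepcopy(self.initial)
        self.steps = 0

    def forward(self, batch):
        if self.episodic:
            self.reset()  # throws away every previous update
        self.steps += 1   # one adaptation step per test batch
        return self.steps

continual = TTAWrapper({"norm.weight": 1.0}, episodic=False)
episodic = TTAWrapper({"norm.weight": 1.0}, episodic=True)
for batch in range(5):
    continual.forward(batch)
    episodic.forward(batch)

print(continual.steps)  # 5: updates accumulate over the stream
print(episodic.steps)   # 1: each batch adapts from scratch
```

With episodic=True the adapter never benefits from earlier batches, which is why the reproduced SAR numbers come out lower than the paper's while TENT (if run without the flag) matches.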