sungnyun / openssl-simcore

(CVPR 2023) Coreset Sampling from Open-Set for Fine-Grained Self-Supervised Learning


About the accuracy of fine-tuning on the target fine-grained datasets

eafn opened this issue · comments

commented

Hi, I have read your paper and it is great work. But I have a question: what is the performance of fine-tuning the entire ResNet-50 (both encoder and classifier) on these target fine-grained datasets with supervised learning? Do you have those results?

Hi, and thanks for the good question!

Yes, in Table 7b we summarized end-to-end supervised fine-tuning results with an SSL-pretrained ResNet-50 on four fine-grained datasets.
However, we only reported partially-labeled (not fully-labeled) scenarios, following previous SSL works.

Here, we additionally summarize end-to-end fine-tuning results on the fully-labeled (100%) target datasets:

| Pretrain | Aircraft | Cars | Pet | Birds |
|----------|----------|-------|-------|-------|
| X        | 79.89    | 88.63 | 78.24 | 67.46 |
| OS       | 81.28    | 87.43 | 85.01 | 70.12 |
| SimCore  | 83.28    | 89.54 | 85.95 | 71.24 |

cf) Note that we fine-tuned the models for 100 epochs with a momentum SGD optimizer and a weight decay of 1e-4.
We searched for the optimal learning rate among three logarithmically spaced values from 1e-1 to 1e-2 (i.e., {1e-1, 3e-2, 1e-2}), and the learning rate is decayed by a factor of 0.1 after 60 and 80 epochs.
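For reference, a minimal PyTorch sketch of that fine-tuning setup could look like the snippet below. This is not the authors' exact script: the momentum value of 0.9, the checkpoint path, and the helper name `build_finetune_setup` are assumptions for illustration.

```python
# Minimal sketch of the end-to-end fine-tuning recipe described above.
# Assumes PyTorch + torchvision and an SSL-pretrained ResNet-50 checkpoint.
import torch
import torch.nn as nn
from torchvision.models import resnet50

def build_finetune_setup(num_classes, lr=3e-2):
    # Full ResNet-50 encoder plus a new linear classifier; all parameters are trainable.
    model = resnet50()
    model.fc = nn.Linear(model.fc.in_features, num_classes)
    # Load SSL-pretrained encoder weights here, e.g. (path is hypothetical):
    # model.load_state_dict(torch.load("simcore_pretrained.pth"), strict=False)

    # Momentum SGD with weight decay 1e-4; lr is searched over {1e-1, 3e-2, 1e-2}.
    # The momentum value 0.9 is an assumption, not stated in the comment above.
    optimizer = torch.optim.SGD(
        model.parameters(), lr=lr, momentum=0.9, weight_decay=1e-4
    )
    # Decay the learning rate by 0.1 after epochs 60 and 80 (100 epochs total).
    scheduler = torch.optim.lr_scheduler.MultiStepLR(
        optimizer, milestones=[60, 80], gamma=0.1
    )
    return model, optimizer, scheduler
```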