yaoyao-liu / meta-transfer-learning

TensorFlow and PyTorch implementation of "Meta-Transfer Learning for Few-Shot Learning" (CVPR 2019)

Home Page: https://lyy.mpi-inf.mpg.de/mtl/

Some questions about the meta-training and meta-testing phases

Sword-keeper opened this issue

Really nice code! However, I still have some questions about the meta-training phase.
1. In main.py, I found that "--num_batch" is 100 in the meta-training phase. Does this mean there are 100 meta-training tasks, chosen from the 64 classes in the 'train' folder? Each task contains 5 classes; the support set has 5×1 = 5 samples and the query set has 5×15 = 75 samples.
Likewise, in the meta-eval phase, tasks are chosen from the 20 classes in the 'test' folder. Each task again contains 5 classes, with 5×1 = 5 support samples and 5×15 = 75 query samples.
Am I right?
2. I can't understand the meaning of "--max_epoch" (line 29). Does it work like a normal deep-learning epoch? For example:
a: epoch 1: tasks 1-100, epoch 2: tasks 101-200, ..., epoch 100: tasks 9901-10000
b: epoch 1: tasks 1-100, epoch 2: tasks 1-100, ..., epoch 100: tasks 1-100
Which one is right? Maybe both are wrong.

Hi @Sword-keeper,

Thanks for your interest in our work.
The PyTorch implementation is built on the open-source code for FEAT, so we follow its settings.

num_batch=100 means we randomly sample 100 tasks from the meta-train set in each epoch. Under the 5-way 1-shot setting, each task contains 5 support samples and 75 query samples. max_epoch=100 means we optimize the meta model over these 100-task epochs 100 times.
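To make the loop structure concrete, here is a minimal sketch of how num_batch and max_epoch typically interact in episodic meta-training. The class counts and shot/query sizes follow this thread; the sampler, the index pool, and the meta_update placeholder are illustrative assumptions, not the repo's actual code.

```python
import random

# Illustrative numbers from this thread (not read from the repo's main.py):
NUM_TRAIN_CLASSES = 64   # classes in the 'train' folder
WAY, SHOT, QUERY = 5, 1, 15
NUM_BATCH = 100          # tasks per epoch   (--num_batch)
MAX_EPOCH = 100          # number of epochs  (--max_epoch)

# Toy index pool: class id -> sample indices (miniImageNet has 600 images per class).
pool = {c: list(range(600)) for c in range(NUM_TRAIN_CLASSES)}

def sample_task():
    """Draw one 5-way 1-shot task: 5x1 = 5 support and 5x15 = 75 query samples."""
    classes = random.sample(sorted(pool), WAY)
    support, query = [], []
    for c in classes:
        idx = random.sample(pool[c], SHOT + QUERY)
        support += [(c, i) for i in idx[:SHOT]]
        query += [(c, i) for i in idx[SHOT:]]
    return support, query

total_updates = 0
for epoch in range(MAX_EPOCH):
    for _ in range(NUM_BATCH):
        support, query = sample_task()
        # meta_update(model, support, query)  # hypothetical: one meta-step per task
        total_updates += 1

print(total_updates)  # num_batch * max_epoch = 10000 meta-updates in total
```

Whether the 100 tasks per epoch are fixed or re-drawn at random each epoch depends on the sampler (FEAT-style samplers re-draw them), but either way the total number of meta-updates is num_batch × max_epoch = 10,000.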

If you have any further questions, feel free to add more comments.

Thanks! So we have the same opinion about num_batch.
As for max_epoch=100: as far as I know, when a dataset has few samples, training for too many epochs makes the model very easy to overfit. For this 5-way 1-shot classification task, why set max_epoch to 100?

Hi @Sword-keeper,

As I explained above, this setting follows FEAT. I have also tried using 10,000 tasks and meta-training the model for one epoch; the performance is similar to using 100 tasks for 100 epochs.

Besides, when data augmentation is applied, overfitting is not a serious problem during meta-training.
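For reference, here is a hedged sketch of a typical augmentation pipeline for 84×84 miniImageNet inputs, built from standard torchvision transforms. The exact transforms and normalization statistics used in this repo may differ; the ImageNet mean/std below are an assumption.

```python
from torchvision import transforms

# Common meta-training augmentation for 84x84 miniImageNet images.
# Normalization statistics are ImageNet's; the repo may use its own values.
train_transform = transforms.Compose([
    transforms.RandomResizedCrop(84),       # random crop, rescaled to 84x84
    transforms.RandomHorizontalFlip(),      # mirror images with probability 0.5
    transforms.ColorJitter(0.4, 0.4, 0.4),  # jitter brightness/contrast/saturation
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])
```

Because every epoch re-applies random crops, flips, and color jitter, the model rarely sees exactly the same support/query images twice, which helps explain why many epochs over the same splits do not overfit badly.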