gidariss / FewShotWithoutForgetting

A question about the training process

HX-idiot opened this issue

Hello, this is really nice work! What confuses me is that in training stage 2, when the weight generator is trained, you keep training weight_base as well, even though weight_base seems to have already been trained well in stage 1 (the pretraining step). Is there a special reason for this? And how do you ensure compatibility between weight_base and the generated parameters, and among the generated parameters themselves?
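For concreteness, here is a minimal sketch of how I understand the stage-2 setup: weight_base stays a trainable parameter while an attention-based generator composes novel-class weights out of the (normalized) base weights, and both are updated together. Apart from weight_base, all names here are hypothetical (not the repo's actual API), and I've simplified the paper's learned mixing coefficients to a plain sum:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CosineClassifierWithGenerator(nn.Module):
    """Sketch of a stage-2 head: base weights remain trainable while an
    attention-based generator builds novel-class weights from them."""
    def __init__(self, feat_dim: int, n_base: int):
        super().__init__()
        # weight_base: pretrained in stage 1, but deliberately NOT frozen here
        self.weight_base = nn.Parameter(torch.randn(n_base, feat_dim) * 0.01)
        self.scale = nn.Parameter(torch.tensor(10.0))  # cosine-similarity temperature
        # query projection used by the attention-based weight generator
        self.query = nn.Linear(feat_dim, feat_dim, bias=False)

    def generate_novel_weights(self, support_feats: torch.Tensor) -> torch.Tensor:
        # support_feats: [n_novel, k_shot, feat_dim] features of the support set
        z = F.normalize(support_feats.mean(dim=1), dim=-1)   # feature averaging
        wb = F.normalize(self.weight_base, dim=-1)
        attn = F.softmax(self.query(z) @ wb.t(), dim=-1)     # attend over base weights
        w_att = attn @ wb                                    # attention-composed term
        return F.normalize(z + w_att, dim=-1)                # mix both terms

    def forward(self, feats: torch.Tensor, support_feats: torch.Tensor) -> torch.Tensor:
        w_novel = self.generate_novel_weights(support_feats)
        w_all = torch.cat([F.normalize(self.weight_base, dim=-1), w_novel], dim=0)
        # logits over base + novel classes via scaled cosine similarity
        return self.scale * F.normalize(feats, dim=-1) @ w_all.t()

# Stage 2: the optimizer covers BOTH the generator and weight_base,
# so the base weights keep moving while the generator is trained.
model = CosineClassifierWithGenerator(feat_dim=128, n_base=64)
opt = torch.optim.SGD(model.parameters(), lr=0.1)
```

In this sketch, the stage-2 optimizer deliberately includes weight_base, which is exactly the part that prompts my question about why it keeps being updated.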
Also, can this method be used when N is very large (in the N-way K-shot setting)? In the extreme case, N might even be larger than the number of base weights in weight_base. If possible, I would appreciate any suggestions.
Thank you~