Finetune on a new sample
lelechen63 opened this issue
Lele Chen commented
Thanks for the code and excellent work. I have one question about the few-shot step: after training, do we need to fine-tune the AdaIN layers on the new samples? As far as I can tell, you just compute the latent code from the new samples and run inference without any fine-tuning. Why do we call this few-shot? Thanks in advance!
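To make sure I understand the inference path: it seems to be embed the K new samples, average the embeddings into one latent code, and use that code to set the AdaIN statistics, with no gradient updates at all. A rough NumPy sketch of what I mean (the function names and the toy embedder here are mine, not from this repo):

```python
import numpy as np

def adain(content, style_mean, style_std, eps=1e-5):
    """Adaptive instance normalization: swap in per-channel style statistics."""
    mean = content.mean(axis=(1, 2), keepdims=True)  # per-channel content mean
    std = content.std(axis=(1, 2), keepdims=True)    # per-channel content std
    return (content - mean) / (std + eps) * style_std + style_mean

def few_shot_latent(embed, samples):
    """Average per-sample embeddings into one latent code -- no fine-tuning."""
    return np.mean([embed(s) for s in samples], axis=0)

# Toy demo: channel-wise mean/std of a frame stand in for a learned embedder.
rng = np.random.default_rng(0)
frames = [rng.normal(size=(3, 8, 8)) for _ in range(4)]  # K=4 new samples
embed = lambda img: np.stack([img.mean(axis=(1, 2)), img.std(axis=(1, 2))])
latent = few_shot_latent(embed, frames)                  # shape (2, 3)
style_mean = latent[0][:, None, None]
style_std = latent[1][:, None, None]
content = rng.normal(size=(3, 8, 8))
stylized = adain(content, style_mean, style_std)         # conditioned output
```

So the only "adaptation" to the new identity is the averaged latent code feeding the AdaIN layers; the network weights themselves never change. Is that the intended reading of few-shot here?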