Maclory / Deep-Iterative-Collaboration

PyTorch implementation of Deep Face Super-Resolution with Iterative Collaboration between Attentive Recovery and Landmark Estimation (CVPR 2020)

About the pretrained HG model

yukichou opened this issue · comments

Is it possible to change the number of key points for training DIC without any extra work?
I want to run some cross-domain experiments with the network you proposed, but my dataset has only 20 landmarks instead of 68, so I can't use the HG pre-trained model, which was trained on 68 key points. Can I simply train DIC without the HG pre-trained model, or do I need to train the HG model from scratch with my own key points? Thanks for your reply.

I suggest training the HG pre-trained model from scratch with your own key points, since training DIC with a pre-trained HG is more stable. For the HG training, you can refer to the repo.
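Below is a minimal sketch of why the 68-point HG checkpoint cannot be reused directly for 20 landmarks. The class and parameter names (`LandmarkHead`, `feat_channels`, `num_landmarks`) are illustrative placeholders, not the actual modules in this repository: the point is only that the final 1x1 convolution produces one heatmap per landmark, so its weights from a 68-point checkpoint do not fit a 20-point head.

```python
# Illustrative sketch only: names below are placeholders, not this repo's API.
import torch
import torch.nn as nn

class LandmarkHead(nn.Module):
    """Predicts one heatmap per landmark from backbone features."""
    def __init__(self, feat_channels: int = 256, num_landmarks: int = 20):
        super().__init__()
        # Output channels = number of landmarks (one heatmap each).
        self.out = nn.Conv2d(feat_channels, num_landmarks, kernel_size=1)

    def forward(self, feats: torch.Tensor) -> torch.Tensor:
        return self.out(feats)  # (B, num_landmarks, H, W)

# A checkpoint trained with 68 landmarks has out.weight of shape
# (68, feat_channels, 1, 1); loading it into a 20-landmark head fails,
# which is why the HG is retrained from scratch on the new annotations.
head_68 = LandmarkHead(num_landmarks=68)
head_20 = LandmarkHead(num_landmarks=20)
try:
    head_20.load_state_dict(head_68.state_dict())
except RuntimeError as e:
    print("Shape mismatch, retrain with 20 landmarks:", e)
```

In practice this means setting the landmark count in the HG training configuration to 20, training that HG on your annotated data, and then using the resulting checkpoint as the pre-trained landmark estimator when training DIC.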