MCG-NJU / EMA-VFI

[CVPR 2023] Extracting Motion and Appearance via Inter-Frame Attention for Efficient Video Frame Interpolation


Can you provide arbitrary frame interpolation training code?

JRStudio-107 opened this issue · comments

The existing version only includes the fixed-timestep frame interpolation training process. I want to learn the process of arbitrary frame interpolation, thanks!

Thanks for your interest! We won't provide training code for arbitrary temporal interpolation, in order to keep the code base simple. But note that the configuration of the training dataset is the only thing that needs to change (we follow the setting of RIFE); the rest of the training process is essentially the same for both fixed- and arbitrary-timestep training.
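For anyone following along, the RIFE-style dataset change amounts to sampling three ordered frames from each training clip instead of a fixed triplet, and deriving the timestep from their positions. A minimal sketch (the function name and the 7-frame clip layout are assumptions mirroring Vimeo90K septuplets, not the actual EMA-VFI dataset code):

```python
import random
import numpy as np

def sample_arbitrary_triplet(septuplet, rng=random):
    """Pick (I0, It, I1) from a 7-frame clip and derive the timestep.

    Following the RIFE-style setting, draw three ordered indices i < j < k
    and use t = (j - i) / (k - i) as the interpolation timestep, so the
    model sees timesteps other than 0.5 during training.
    This is an illustrative sketch, not the repo's actual dataset class.
    """
    i, j, k = sorted(rng.sample(range(len(septuplet)), 3))
    t = (j - i) / (k - i)  # strictly inside (0, 1) since i < j < k
    return septuplet[i], septuplet[j], septuplet[k], t

# toy usage with random "frames" standing in for a Vimeo90K septuplet
clip = [np.random.rand(64, 64, 3).astype(np.float32) for _ in range(7)]
img0, gt, img1, t = sample_arbitrary_triplet(clip)
```

Here `img0`/`img1` are the model inputs, `gt` the supervision target, and `t` is the extra timestep fed to the network; everything else in the training loop stays as in the fixed-timestep version.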

Thank you for your reply. In fact, while waiting for your reply, I built an arbitrary temporal interpolation training setup, also following RIFE's settings. However, during training my loss barely changes, the PSNR measured after each epoch gets worse and worse, and the frames produced by the trained model are all black. Have you ever encountered a similar case?

No, the training process was very stable. You could check whether the timestep is computed correctly and is actually passed into the model.
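A generic sanity check along these lines can catch the most common failure (a malformed or missing timestep silently reaching the network). This helper is a debugging sketch of my own, not part of the EMA-VFI code base:

```python
import numpy as np

def check_timestep(timestep, batch_size):
    """Validate the interpolation timestep before the forward pass.

    A timestep that is out of range, the wrong shape, or never reaches
    the network is a common cause of flat loss and all-black predictions.
    Accepts a scalar or one value per sample; raises on anything else.
    """
    ts = np.asarray(timestep, dtype=np.float32).reshape(-1)
    if ts.size not in (1, batch_size):
        raise ValueError(f"timestep has {ts.size} values for batch of {batch_size}")
    if not np.all((ts > 0.0) & (ts < 1.0)):
        raise ValueError(f"timestep outside (0, 1): {ts}")
    return ts

# example: a batch of 4 samples with per-sample timesteps
ts = check_timestep([0.25, 0.5, 0.75, 1 / 3], batch_size=4)
```

Calling this right before the model's forward pass (and printing a few sampled timesteps during the first epoch) quickly shows whether the arbitrary-t sampling is actually active.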

Hello @JRStudio-107 and @GuozhenZhang1999, I am also interested in exploring the training of arbitrary temporal interpolation. Could you please release the relevant code in another branch or repo?

Hi, I wanted to ask if the arbitrary-timestep training code can be shared. How can I reproduce the results?