wrong code.
LemonQC opened this issue
LemonQC commented
Jiayuan Mao commented
If you look at the forward function, they will be applied in different ways (recurrent vs. a single forward step).
LemonQC commented
> If you look at the forward function, they will be applied in different ways (recurrent vs. a single forward step).
OK, I found it too. So you just use the recurrent approach rather than an RNN?
Jiayuan Mao commented
An RNN basically means applying the same neural module (typically named RNNCell in PyTorch) recurrently, so calling it an RNN is fine. And in the end, this is just the name of a variable…
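To illustrate the point, here is a minimal dependency-free sketch (a toy `rnn_cell` with a made-up update rule, not code from this repo): "RNN" just means the same cell function applied recurrently over a sequence, exactly as `nn.RNN` applies an `RNNCell` internally.

```python
def rnn_cell(h, x):
    # One recurrent step: combine the previous hidden state with the
    # current input. A real PyTorch RNNCell computes
    # h' = tanh(W_ih @ x + W_hh @ h + b); this toy update only
    # illustrates the recurrence pattern.
    return 0.5 * h + x

def rnn(inputs, h0=0.0):
    # "RNN" = the same cell applied recurrently; the loop IS the recurrence.
    h = h0
    for x in inputs:
        h = rnn_cell(h, x)
    return h

print(rnn([1.0, 2.0, 3.0]))  # 0 -> 1.0 -> 2.5 -> 4.25
```

Whether you call the helper an "RNN" or "a module applied recurrently" is purely a naming choice; the computation is identical.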