asdf0982 / vqa-mfb.pytorch

This project is out of date; I no longer remember the details inside...

Question about MFB Baseline

adeelz92 opened this issue

This is a confirmation question about the MFB baseline. According to the paper there should be two LSTM layers with 1024-D hidden units each, but the implementation uses only one 1024-D LSTM layer. Kindly confirm.
Also, does a two-layer LSTM mean stacking two LSTM layers on top of each other?
Thanks

@adeelz92 Yes. I remember that two LSTM layers and a single LSTM layer had the same effect in my experiments, so I used one layer. You can also run experiments to compare.
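For what it's worth, "stacking" two LSTM layers just means feeding the per-step outputs of the first layer into a second layer; in PyTorch this is the `num_layers` argument of `nn.LSTM`. A minimal sketch of the two variants discussed above (the 1024-D hidden size is from the paper; the embedding size, sequence length, and batch size here are illustrative placeholders, not values from this repo):

```python
import torch
import torch.nn as nn

# Illustrative dimensions; only the 1024-D hidden size comes from the paper.
EMB_DIM, HIDDEN_DIM, SEQ_LEN, BATCH = 300, 1024, 14, 8

# One-layer LSTM, as used in this repo's implementation.
lstm_one = nn.LSTM(input_size=EMB_DIM, hidden_size=HIDDEN_DIM,
                   num_layers=1, batch_first=True)

# Two stacked LSTM layers, as described in the MFB paper:
# the second layer consumes the hidden states of the first.
lstm_two = nn.LSTM(input_size=EMB_DIM, hidden_size=HIDDEN_DIM,
                   num_layers=2, batch_first=True)

x = torch.randn(BATCH, SEQ_LEN, EMB_DIM)  # dummy question embeddings
out_one, _ = lstm_one(x)
out_two, _ = lstm_two(x)
print(out_one.shape, out_two.shape)  # both: torch.Size([8, 14, 1024])
```

Note that the output shape is identical in both cases, so swapping one variant for the other requires no changes downstream; only the depth of the question encoder differs.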