Bug: dropout is not deactivated during evaluation
andreimargeloiu opened this issue · comments
Problem: Calling model.score() repeatedly gives different results. I suspect that's because LassoNet doesn't call model.eval() to deactivate the stochastic components (e.g., dropout).
Solution: Call model.eval() inside the .score() function. See: https://stackoverflow.com/questions/60018578/what-does-model-eval-do-in-pytorch
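To illustrate the underlying behavior, here is a minimal PyTorch sketch (a toy network, not LassoNet's actual model) showing that dropout makes repeated forward passes stochastic in training mode and deterministic after model.eval():

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy network with dropout; a stand-in for LassoNet's model (sizes are arbitrary).
model = nn.Sequential(nn.Linear(10, 64), nn.Dropout(p=0.5), nn.Linear(64, 1))
x = torch.randn(4, 10)

# Training mode: dropout draws a fresh random mask on every forward pass,
# so repeated calls on the same input give different outputs.
model.train()
print(torch.equal(model(x), model(x)))  # False

# Eval mode: dropout becomes the identity, so outputs are deterministic.
model.eval()
print(torch.equal(model(x), model(x)))  # True
```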
Thanks! I just fixed it.
Now I wonder whether I should also call model.eval() when computing validation_obj, which is used for early stopping. What do you think? Would you be willing to test and share your experience?
Yes, you should call model.eval() everywhere stochastic components are undesired (e.g., when computing validation/test loss and predictions). I can test the framework after you push the updates :-)
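For the early-stopping case, a common pattern is to wrap the validation computation so that dropout is switched off, gradients are disabled, and the previous mode is restored afterwards. A minimal sketch (the helper name and signature are hypothetical, not LassoNet's actual code):

```python
import torch
import torch.nn as nn

def deterministic_val_loss(model, criterion, X_val, y_val):
    # Hypothetical helper: evaluate the validation objective with
    # dropout off (model.eval()) and gradient tracking disabled.
    was_training = model.training
    model.eval()
    with torch.no_grad():
        loss = criterion(model(X_val), y_val).item()
    if was_training:
        model.train()  # restore training mode for the next epoch
    return loss

# Usage on a toy model with dropout: repeated calls now agree,
# and the model's training mode is left untouched.
torch.manual_seed(0)
model = nn.Sequential(nn.Linear(10, 64), nn.Dropout(0.5), nn.Linear(64, 1))
X_val, y_val = torch.randn(8, 10), torch.randn(8, 1)
model.train()
l1 = deterministic_val_loss(model, nn.MSELoss(), X_val, y_val)
l2 = deterministic_val_loss(model, nn.MSELoss(), X_val, y_val)
print(l1 == l2)        # True
print(model.training)  # True
```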
They are already pushed. You can install with pip install git+https://github.com/lasso-net/lassonet or clone and pip install -e .
I don't know whether validation requires a deterministic loss. I asked on our internal mailing list and will discuss it at our next group meeting!