How to define optimizer other than SGD? Like Adam etc.
hamzagorgulu opened this issue · comments
hamzagorgulu commented
I wonder how I can use Adam as my optimizer instead of SGD. I changed the optimizer type to Adam, but Adam requires beta values. When I try to pass them as "betas", runx gives the error:
train.py: error: unrecognized arguments: --betas 0.9
Any help with this?
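For context, the error happens because `train.py`'s argument parser has no `--betas` flag, and `torch.optim.Adam` takes `betas` as a single tuple rather than two separate values. Below is a minimal sketch (not this repo's actual code) of one workaround: register two hypothetical scalar flags, `--beta1` and `--beta2`, and build the tuple when constructing the optimizer.

```python
import argparse
import torch
import torch.nn as nn

# Hypothetical flags: the repo's train.py does not define --betas, so one
# option is to add scalar arguments and assemble the tuple in code.
parser = argparse.ArgumentParser()
parser.add_argument('--lr', type=float, default=1e-3)
parser.add_argument('--beta1', type=float, default=0.9)
parser.add_argument('--beta2', type=float, default=0.999)
args = parser.parse_args([])  # empty list stands in for the real CLI args

model = nn.Linear(10, 2)  # stand-in model for illustration

# Adam expects betas=(beta1, beta2) as one tuple, not two separate flags
optimizer = torch.optim.Adam(model.parameters(), lr=args.lr,
                             betas=(args.beta1, args.beta2))
```

With this, the run would be invoked with `--beta1 0.9 --beta2 0.999` instead of a single `--betas 0.9`.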