lollcat / fab-jax-old


Compare optimizers

lollcat opened this issue

The training regime is unusual (new training data is continually being generated, and its quality improves over the course of training, so the data distribution is non-stationary), which makes it worth experimenting with hyper-parameters and different optimizers.

  • Run a study of different optimizers on the 16-dimensional Many Well problem (a minimal comparison harness is sketched below)
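
Such a study mostly amounts to swapping the optimizer in the training loop. Here is a minimal sketch of a comparison harness, assuming an optax-style loop; the toy quadratic loss stands in for the actual FAB training objective, and all names below are illustrative, not fab-jax's actual API:

```python
import jax
import jax.numpy as jnp
import optax

def loss_fn(params):
    # Stand-in loss: a simple quadratic bowl. In the real study this
    # would be the flow's training loss on the Many Well target.
    return jnp.sum((params - 3.0) ** 2)

def train(optimizer, num_steps=500):
    params = jnp.zeros(16)  # e.g. one parameter per Many Well dimension
    opt_state = optimizer.init(params)

    @jax.jit
    def step(params, opt_state):
        loss, grads = jax.value_and_grad(loss_fn)(params)
        updates, opt_state = optimizer.update(grads, opt_state, params)
        params = optax.apply_updates(params, updates)
        return params, opt_state, loss

    for _ in range(num_steps):
        params, opt_state, loss = step(params, opt_state)
    return loss

# Candidate optimizers to compare; learning rates would also need tuning.
optimizers = {
    "adam": optax.adam(1e-2),
    "adamw": optax.adamw(1e-2),
    "sgd+momentum": optax.sgd(1e-2, momentum=0.9),
    "rmsprop": optax.rmsprop(1e-2),
}
for name, opt in optimizers.items():
    print(f"{name}: final loss {train(opt):.3e}")
```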

Adam is fine after stability fixes (i.e. adding an LU flow layer; a sketch of such a layer follows).
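
For context, here is a minimal sketch of an LU-parameterized invertible linear layer in the style of Glow (Kingma & Dhariwal, 2018). This illustrates the general technique only and is not necessarily the exact layer added to fab-jax. The point of the LU form is that log|det W| reduces to the sum of the log-scales, so the determinant is cheap and numerically stable:

```python
import jax
import jax.numpy as jnp
from jax.scipy.linalg import lu

def init_lu_params(key, dim):
    # Start from a random rotation so the layer begins well-conditioned.
    w = jax.random.orthogonal(key, dim)
    p, l, u = lu(w)  # w = p @ l @ u
    s = jnp.diag(u)
    return {
        "p": p,                  # fixed permutation (exclude from training)
        "l": jnp.tril(l, k=-1),  # strictly lower triangular, trained
        "u": jnp.triu(u, k=1),   # strictly upper triangular, trained
        "log_s": jnp.log(jnp.abs(s)),  # log-scales, trained
        "sign_s": jnp.sign(s),   # fixed signs, so the determinant never
    }                            # changes sign during training

def lu_forward(params, x):
    dim = x.shape[-1]
    l = params["l"] + jnp.eye(dim)
    u = params["u"] + jnp.diag(params["sign_s"] * jnp.exp(params["log_s"]))
    w = params["p"] @ l @ u
    # log|det W| is just the sum of the log-scales.
    log_det = jnp.sum(params["log_s"])
    return x @ w.T, log_det
```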