jayroxis / PINNs

PyTorch Implementation of Physics-informed Neural Networks

Why are two optimizers needed during training?

manwu1994 opened this issue · comments

Hello, thanks a lot for your hard work and for sharing it.
I would like to ask why two optimizers are needed during training:
self.optimizer.step(self.loss_func)
self.optimizer_Adam.step()

Thank you so much in advance for your answers.

Hi,

The original PINN uses two optimizers: Adam for the initial optimization, since it is generally more stable, and then the second-order optimizer L-BFGS for fine-tuning to reach higher accuracy. That also explains the two calls you quoted: in PyTorch, LBFGS.step() requires a closure that re-evaluates the loss (the line search may call it several times per step), which is why it is invoked as self.optimizer.step(self.loss_func), while Adam's step() takes no arguments.
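
For concreteness, here is a minimal sketch of that two-stage schedule in PyTorch. The network net, the collocation points x, and compute_loss() below are hypothetical placeholders, not the repo's actual code; only the optimizer usage mirrors the calls quoted above.

```python
import torch

# Placeholder network and data; a real PINN would also compute the PDE
# residual via autograd and add boundary/initial-condition losses.
net = torch.nn.Sequential(
    torch.nn.Linear(2, 20), torch.nn.Tanh(), torch.nn.Linear(20, 1)
)
x = torch.rand(100, 2)  # fixed (hypothetical) collocation points

def compute_loss():
    return (net(x) ** 2).mean()  # stand-in for the full PINN loss

# Stage 1: Adam for stable initial optimization; step() takes no arguments.
adam = torch.optim.Adam(net.parameters(), lr=1e-3)
for _ in range(1000):
    adam.zero_grad()
    loss = compute_loss()
    loss.backward()
    adam.step()

# Stage 2: L-BFGS for fine-tuning. Its step() needs a closure because the
# line search may re-evaluate the loss and gradients several times per step.
lbfgs = torch.optim.LBFGS(
    net.parameters(), max_iter=500, line_search_fn="strong_wolfe"
)

def closure():
    lbfgs.zero_grad()
    loss = compute_loss()
    loss.backward()
    return loss

lbfgs.step(closure)  # hence step(self.loss_func) in the repo's code
```

Running Adam first tends to land the parameters in a good basin; L-BFGS then exploits approximate curvature information to converge to a sharper minimum than Adam usually reaches on its own.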