rmojgani / LPINNs

To address some of the failure modes in the training of physics-informed neural networks (PINNs), a Lagrangian architecture is designed to conform to the direction of travel of information in convection-diffusion equations, i.e., the method of characteristics. The repository includes PyTorch implementations of a PINN and the proposed LPINN with periodic boundary conditions.


Lagrangian physics-informed neural networks - LPINNs

Table of contents

  • Introduction
  • Requirements
  • Experiments
  • How to cite?
  • References

Introduction

Physics-informed neural networks (PINNs) leverage neural networks to find the solutions of partial differential equation (PDE)-constrained optimization problems, with initial and boundary conditions imposed as soft constraints. We propose reformulating PINNs in a Lagrangian frame of reference, i.e., LPINNs, as a PDE-informed solution. A parallel architecture with two branches is proposed: one branch solves for the state variables on the characteristics, and the second branch solves for the low-dimensional characteristic curves. The proposed architecture conforms to the causality innate to convection and leverages the direction of travel of information in the domain.
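The two-branch layout can be pictured with a short PyTorch sketch. This is a minimal illustration of the idea only, not the repository's LPINN_POLAR implementation; the class name, widths, and depths below are placeholders (the defaults simply mirror the --HIDDEN, --DEEPu, and --DEEPx arguments listed later):

import torch
import torch.nn as nn

class TwoBranchLPINNSketch(nn.Module):
    """Illustrative two-branch Lagrangian PINN: an x-branch for the
    characteristic curves and a u-branch for the state on them."""

    def __init__(self, hidden=50, deep_u=5, deep_x=2):
        super().__init__()

        def mlp(depth):
            # Fully connected (x0, t) -> scalar network with `depth` hidden layers
            layers = [nn.Linear(2, hidden), nn.Tanh()]
            for _ in range(depth - 1):
                layers += [nn.Linear(hidden, hidden), nn.Tanh()]
            layers += [nn.Linear(hidden, 1)]
            return nn.Sequential(*layers)

        self.x_branch = mlp(deep_x)  # low-dimensional characteristic curves x(x0, t)
        self.u_branch = mlp(deep_u)  # state u along the characteristics

    def forward(self, x0, t):
        # x0: initial (Lagrangian) coordinate, t: time; both of shape (batch, 1)
        inp = torch.cat([x0, t], dim=-1)
        return self.x_branch(inp), self.u_branch(inp)

Both branches take the Lagrangian coordinate and time as inputs, so the PDE residuals can be formed on the characteristics via automatic differentiation with respect to (x0, t).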

Our contributions are threefold:

  • Explain the complexity of training through the lens of approximation theory,
  • Identify the viscous Burgers' equation with a moving shock as a challenging case, and
  • Propose Lagrangian PINNs as a causality-conforming architecture for convection-dominated convection-diffusion PDEs (the underlying reformulation is sketched right after this list).
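The reformulation rests on the standard method-of-characteristics identity: rewriting the convection-diffusion equation along its characteristic curves removes the convective term from the state equation. The notation below is generic; for the viscous Burgers' equation the convective speed is c = u:

\begin{align}
  \frac{\partial u}{\partial t} + c\,\frac{\partial u}{\partial x}
    &= \nu\,\frac{\partial^{2} u}{\partial x^{2}}
    && \text{(Eulerian frame)} \\
  \frac{\mathrm{d}x}{\mathrm{d}t} = c, \qquad
  \frac{\mathrm{d}u}{\mathrm{d}t}
    &= \nu\,\frac{\partial^{2} u}{\partial x^{2}}
    && \text{(Lagrangian frame, along the characteristics)}
\end{align}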

Requirements

To do

The following code will be added soon:

  • Lagrangian PINNs for 2D convection
  • The function to call PyHessian and export the loss landscape
  • Matlab file to plot the landscapes
  • Pytorch codes for
    • Sequence to sequence learning [2]
    • Extended sequence to sequence learning (our contribution) [0]
    • Curriculum learning [2]

Experiments

Main file

LPINN.py accepts the following arguments:


import argparse

parser = argparse.ArgumentParser()

# Case parameters
parser.add_argument('--EQN_TYPE', type=str, default='convection',
                    choices=['Burgers', 'convection', 'reaction_diffusion', 'reaction'], help='Equation type')
parser.add_argument('--C', type=float, default=50.0, help='Convection/wave speed')
parser.add_argument('--NU0', type=float, default=0.01, help='Viscosity')

parser.add_argument('--U0_TYPE', type=str, default='gauss',
                    choices=['exp', 'gauss', 'sin', 'bell', 'sin(x)'], help='Initial condition case')
parser.add_argument('--to', type=float, default=1.0, help='t_{max}')

# Architecture parameters
parser.add_argument('--NET_TYPE', type=str, default='LPINN_POLAR',
                    choices=['LPINN_POLAR', 'PINN_POLAR'], help='Network type')
parser.add_argument('--DEEPu', type=int, default=5, choices=range(1, 10), help='u-network: number of hidden layers')
parser.add_argument('--DEEPx', type=int, default=2, choices=range(1, 10), help='x-network: number of hidden layers (to be developed)')
parser.add_argument('--HIDDEN', type=int, default=50, help='Nodes per hidden layer')
parser.add_argument('--SEED', type=int, default=0, help='Pseudo-random seed')

# Data parameters
parser.add_argument('--N', type=int, default=256, choices=range(200, 500), help='Number of spatial grid points')
parser.add_argument('--M', type=int, default=100, choices=range(100, 1000), help='Number of time steps')

# Optimizer parameters
parser.add_argument('--NUM_EPOCHS_ADAM', type=int, default=int(1e6), help='Number of epochs, Adam')
parser.add_argument('--NUM_EPOCHS_SGD', type=int, default=int(0), help='Number of epochs, SGD')
parser.add_argument('--NUM_EPOCHS_BFGS', type=int, default=int(1e2), help='Number of epochs, BFGS')
parser.add_argument('--LR0', type=float, default=0.01, help='[Initial] learning rate')

# Loss weights
parser.add_argument('--GAMMA_RX', type=float, default=10.0, help='Weight of the x-residual (characteristics) loss')
parser.add_argument('--GAMMA_RU', type=float, default=1.0, help='Weight of the u-residual (PDE) loss')
parser.add_argument('--GAMMA_IC', type=float, default=1000.0, help='Weight of the initial-condition loss')
parser.add_argument('--GAMMA_BC', type=float, default=10.0, help='Weight of the boundary-condition loss')

args = parser.parse_args()
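The four GAMMA_* flags weight the individual terms of the training objective. Below is a minimal sketch of how such a weighted composite PINN loss is typically assembled; the function and tensor names are illustrative placeholders rather than the repository's exact implementation:

import torch

def composite_loss(res_x, res_u, u_ic_pred, u_ic_true, u_bc_left, u_bc_right,
                   gamma_rx=10.0, gamma_ru=1.0, gamma_ic=1000.0, gamma_bc=10.0):
    """Weighted sum of the LPINN loss terms (illustrative sketch).

    res_x   : residual of the characteristics (x-branch) equation
    res_u   : residual of the state (u-branch) equation
    u_ic_*  : predicted / prescribed state at t = 0
    u_bc_*  : predicted state at the two ends of the periodic domain
    """
    loss_rx = torch.mean(res_x ** 2)                     # characteristics residual
    loss_ru = torch.mean(res_u ** 2)                     # PDE residual
    loss_ic = torch.mean((u_ic_pred - u_ic_true) ** 2)   # initial condition
    loss_bc = torch.mean((u_bc_left - u_bc_right) ** 2)  # periodic boundary condition
    return (gamma_rx * loss_rx + gamma_ru * loss_ru
            + gamma_ic * loss_ic + gamma_bc * loss_bc)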

Training script

bash train_LPINN.sh

Post-processing script

bash post_LPINN.sh

How to cite?

Published in Computer Methods in Applied Mechanics and Engineering:

  • [0] Mojgani, R., Balajewicz, M., and Hassanzadeh, P., "Kolmogorov n–width and Lagrangian physics-informed neural networks: A causality-conforming manifold for convection-dominated PDEs", Computer Methods in Applied Mechanics and Engineering, Volume 404, 115810, 2023. (arXiv, Elsevier)
    BibTeX
    @article{Mojgani_CMAME_2023,
    author = {Rambod Mojgani and Maciej Balajewicz and Pedram Hassanzadeh},
    title = {Kolmogorov n–width and {L}agrangian physics-informed neural networks: {A} causality-conforming manifold for convection-dominated {PDE}s},
    journal = {Computer Methods in Applied Mechanics and Engineering},
    volume = {404},
    pages = {115810},
    year = {2023},
    issn = {0045-7825},
    archivePrefix = "arXiv",
    eprint = {2205.02902},
    doi = {https://doi.org/10.1016/j.cma.2022.115810},
    url = {https://www.sciencedirect.com/science/article/pii/S0045782522007666},
    }

References

The Lagrangian framework for data-driven modeling of convection-dominated flows was first introduced in [1], specifically in projection-based reduced-order modeling (pROMs). The literature on the challenges of training PINNs, and the proposed remedies, is discussed in our paper [0]. This work is highly inspired by the failure modes of PINNs [2] and the discussion of causality in the training of PINNs [3]. The code to compute the loss landscape of the neural network is an open-source package, PyHessian [4], modified for our purpose under the MIT License. Future work includes generalization of our method to address the Kolmogorov n-width using low-rank registration-based autoencoders/manifolds [5].
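Until the loss-landscape export function listed in the to-do section is released, the sketch below shows how PyHessian [4] is commonly used to probe the curvature of a trained model's loss. It assumes a trained model, a criterion, and a representative batch (inputs, targets) already exist; the calls follow PyHessian's public examples and may differ from the modified version used in this repository:

import torch
from pyhessian import hessian  # PyHessian package [4]

# Assumed to be defined elsewhere: a trained PINN/LPINN `model`, a loss
# `criterion`, and a representative batch of data `(inputs, targets)`.
hessian_comp = hessian(model, criterion, data=(inputs, targets),
                       cuda=torch.cuda.is_available())

# Leading Hessian eigenpairs characterize the sharpness of the loss landscape.
top_eigenvalues, top_eigenvectors = hessian_comp.eigenvalues(top_n=2)

# Hessian trace (stochastic estimates) and eigenvalue spectral density.
trace_estimates = hessian_comp.trace()
density_eigen, density_weight = hessian_comp.density()

print('Top eigenvalues:', top_eigenvalues)
print('Mean trace estimate:', sum(trace_estimates) / len(trace_estimates))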
