Ameya D. Jagtap (AmeyaJagtap)


Location: United States

Home Page: https://sites.google.com/view/ameyadjagtap/home


Ameya D. Jagtap's repositories

XPINNs

Extended Physics-Informed Neural Networks (XPINNs): A Generalized Space-Time Domain Decomposition Based Deep Learning Framework for Nonlinear Partial Differential Equations

Conservative_PINNs

We propose a conservative physics-informed neural network (cPINN) on decomposed domains for nonlinear conservation laws. The conservation property of cPINN is obtained by enforcing the flux continuity in the strong form along the sub-domain interfaces.

Language: Python · License: MIT · Stargazers: 48 · Issues: 3 · Issues: 2
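The interface terms described above can be sketched as follows. This is a minimal NumPy illustration; the function and argument names are assumptions for exposition, not the repository's API:

```python
import numpy as np

def interface_loss(u1, u2, flux, x_iface):
    """Sketch of the cPINN interface penalty (illustrative names):
    penalize mismatch of the solution and of the flux, in the strong
    form, at the points x_iface shared by two sub-domain networks."""
    a, b = u1(x_iface), u2(x_iface)
    continuity = np.mean((a - b) ** 2)              # solution continuity
    flux_match = np.mean((flux(a) - flux(b)) ** 2)  # flux continuity
    return continuity + flux_match
```

When the two sub-domain networks agree on the interface, both terms vanish; any mismatch in either the state or its flux is penalized in the total loss.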

Locally-Adaptive-Activation-Functions-Neural-Networks-

Python code for the Locally Adaptive Activation Function (LAAF) used in deep neural networks. Please cite this work as "A D Jagtap, K Kawaguchi, G E Karniadakis, Locally adaptive activation functions with slope recovery for deep and physics-informed neural networks, Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences, 20200334, 2020. (http://dx.doi.org/10.1098/rspa.2020.0334)".

Language: Python · License: MIT · Stargazers: 36 · Issues: 3 · Issues: 1
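The idea behind LAAF can be sketched in a few lines of NumPy. This is a hedged sketch, not the repository's implementation; the scale factor n, the trainable slope a, and the exact form of the slope-recovery term are simplified here for illustration:

```python
import numpy as np

def laaf_tanh(x, a, n=10):
    """Layer-wise LAAF sketch: tanh(n * a * x), where n is a fixed
    scale factor and a is a trainable slope (one per layer, or one
    per neuron in the neuron-wise variant)."""
    return np.tanh(n * a * x)

def slope_recovery(slopes, n=10):
    """Slope-recovery sketch: a regularizer that shrinks as the mean
    exponential of the layer slopes grows, nudging the optimizer to
    increase the slopes and accelerate early training."""
    return 1.0 / np.mean(np.exp(n * np.asarray(slopes)))
```

Adding the slope-recovery term to the loss rewards larger activation slopes, which in the paper is shown to speed up convergence of both standard and physics-informed networks.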

XPINNs_TensorFlow-2

XPINN code written in TensorFlow 2

License: MIT · Stargazers: 23 · Issues: 5 · Issues: 0

Rowdy_Activation_Functions

We propose the Deep Kronecker Neural Network, a general framework for neural networks with adaptive activation functions. In particular, we propose Rowdy activation functions, which inject sinusoidal fluctuations, thereby allowing the optimizer to explore more of the loss landscape and train the network faster. Various test cases, ranging from function approximation and inferring PDE solutions to standard deep learning benchmarks such as MNIST, CIFAR-10, CIFAR-100, and SVHN, are solved to show the efficacy of the proposed activation functions.

Language: Python · License: MIT · Stargazers: 10 · Issues: 3 · Issues: 0
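A Rowdy activation can be sketched as a base activation plus trainable sinusoidal perturbations. This NumPy snippet is an illustrative simplification under assumed names (amps for the trainable amplitudes, n for the fixed frequency scale), not the repository's code:

```python
import numpy as np

def rowdy_tanh(x, amps, n=10):
    """Rowdy activation sketch: a standard base activation (tanh here)
    plus trainable sinusoidal terms n * a_k * sin(k * n * x) that
    inject high-frequency fluctuations into the network."""
    out = np.tanh(x)
    for k, a in enumerate(amps, start=1):
        out = out + n * a * np.sin(k * n * x)
    return out
```

With all amplitudes initialized to zero the activation reduces to plain tanh, so the sinusoidal terms are switched on only where the optimizer finds them useful.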

Adaptive_Activation_Functions

We propose simple adaptive activation functions for deep neural networks. The proposed method is easy to implement in any neural network architecture.

License: MIT · Stargazers: 6 · Issues: 2 · Issues: 0

Error_estimates_PINN_and_XPINN_NonlinearPDEs

The first comprehensive theoretical analysis of PINNs (and XPINNs) for a prototypical nonlinear PDE, the Navier-Stokes equations, is given.

Activation-functions-in-regression-and-classification

How important are activation functions in regression and classification? A survey, performance comparison, and future directions

License: MIT · Stargazers: 1 · Issues: 2 · Issues: 0

fourier_neural_operator

Uses the Fourier transform to learn operators in differential equations.

Language: Python · Stargazers: 1 · Issues: 1 · Issues: 0
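The core building block of a Fourier neural operator, a spectral convolution layer, can be sketched in NumPy as follows. This is a 1-D illustration under assumed names (weights for the learned complex multipliers, modes for the frequency cutoff), not the repository's implementation:

```python
import numpy as np

def spectral_conv1d(u, weights, modes):
    """One Fourier layer sketch: transform to frequency space, keep
    and reweight only the lowest `modes` frequencies with learned
    complex weights, then transform back. Higher modes are truncated,
    which acts as a learned low-pass filter on the operator."""
    u_hat = np.fft.rfft(u)
    out_hat = np.zeros_like(u_hat)
    out_hat[:modes] = u_hat[:modes] * weights[:modes]
    return np.fft.irfft(out_hat, n=u.shape[-1])
```

In a full FNO, several such layers are stacked with pointwise linear paths and nonlinearities, and the weights are learned from input-output pairs of the differential operator.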

locally-adaptive-activation-functions

Simplified implementation of locally adaptive activation functions (LAAF) with slope recovery for deep and physics-informed neural networks (PINNs) in PyTorch.

Language: Python · License: MIT · Stargazers: 1 · Issues: 1 · Issues: 0

Physics_Informed_Deep_Learning

A short course on physics-informed deep learning.

Language: Python · Stargazers: 1 · Issues: 2 · Issues: 0

POD-PINN

POD-PINN code and manuscript

Language: Python · License: GPL-3.0 · Stargazers: 1 · Issues: 3 · Issues: 0

DeepHPMs

Deep Hidden Physics Models: Deep Learning of Nonlinear Partial Differential Equations

Language: Python · License: MIT · Stargazers: 0 · Issues: 1 · Issues: 0

PINNs

Physics Informed Deep Learning: Data-driven Solutions and Discovery of Nonlinear Partial Differential Equations

Language: Python · License: MIT · Stargazers: 0 · Issues: 1 · Issues: 0
Language: Jupyter Notebook · Stargazers: 0 · Issues: 1 · Issues: 0