ZackAXue / PINN_Stan


Self-scalable Tanh (Stan) Activation Function

Applied to Physics-informed Neural Networks (PINN)

This is the official repository for Physics-informed Neural Networks with the Stan (Self-scalable Tanh) activation function.

$\mathrm{Stan}(x) = \tanh(x) + \beta \cdot x \cdot \tanh(x)$

$\beta$ is a trainable, neuron-wise parameter. Other implementations are also available, such as NVIDIA's. The video shows a potential application of PINNs to grab your attention (gotcha!).
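
For reference, here is a minimal sketch of Stan as a standalone PyTorch module; the Stan class name and the num_neurons argument are illustrative, not part of this repository:

import torch
import torch.nn as nn

class Stan(nn.Module):
    # Self-scalable Tanh: Stan(x) = tanh(x) + beta * x * tanh(x),
    # with one trainable beta per neuron, initialized to 1.
    def __init__(self, num_neurons):
        super().__init__()
        self.beta = nn.Parameter(torch.ones(num_neurons))

    def forward(self, x):
        t = torch.tanh(x)
        return t + self.beta * x * t

# Usage: act = Stan(50); y = act(torch.randn(8, 50))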

Note: the code for generating this figure is not in the repository.

The code is being cleaned up for easier use. In the meantime, if you already have a PINN code and want to use the Stan activation, the change is straightforward. In PyTorch, if your activation function is $\tanh$, you can modify it as follows:

# Initialization: add this alongside your weight and bias initializations.
# One trainable beta per neuron per hidden layer, initialized to 1;
# nn.Parameter already sets requires_grad=True by default.
self.beta = torch.nn.Parameter(torch.ones(NN_width, len(layers) - 2))

# In your 'forward pass' function
z = self.activation(x)             # tanh
z = z + self.beta[:, i] * x * z    # i denotes the hidden-layer index
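
For context, a hypothetical forward pass with this modification might look like the sketch below; self.linears and the layer bookkeeping are assumptions about your network, not code from this repository:

# Hypothetical forward pass; 'self.linears' is assumed to be an
# nn.ModuleList of nn.Linear layers built from 'layers'
for i, linear in enumerate(self.linears[:-1]):
    x = linear(x)
    z = torch.tanh(x)                # the original activation
    x = z + self.beta[:, i] * x * z  # Stan: add the trainable scaling term
x = self.linears[-1](x)              # linear output layer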

Please cite this work if you benefit from it: https://ieeexplore.ieee.org/document/10227556.

Note: DO NOT use the arXiv preprint version of the article; it contains several inaccuracies.

Citation (BibTeX):

@article{gnanasambandam2023self,
  title={Self-scalable Tanh (Stan): Multi-Scale Solutions for Physics-Informed Neural Networks},
  author={Gnanasambandam, Raghav and Shen, Bo and Chung, Jihoon and Yue, Xubo and Kong, Zhenyu},
  journal={IEEE Transactions on Pattern Analysis and Machine Intelligence},
  year={2023},
  publisher={IEEE}
}

R. Gnanasambandam, B. Shen, J. Chung, X. Yue and Z. Kong, "Self-scalable Tanh (Stan): Multi-Scale Solutions for Physics-Informed Neural Networks," in IEEE Transactions on Pattern Analysis and Machine Intelligence, doi: 10.1109/TPAMI.2023.3307688.
