RahulSundar / Rowdy_Activation_Functions

We propose the Deep Kronecker Neural Network, a general framework for neural networks with adaptive activation functions. In particular, we propose Rowdy activation functions, which inject sinusoidal fluctuations, thereby allowing the optimizer to exploit the loss landscape more effectively and train the network faster. Various test cases, ranging from function approximation and inference of PDE solutions to standard deep learning benchmarks such as MNIST, CIFAR-10, CIFAR-100, and SVHN, demonstrate the efficacy of the proposed activation functions.

Deep Kronecker Neural Networks: A general framework for adaptive activation functions (Rowdy Activation Functions)

The Rowdy activation function code is written in TensorFlow 1.

We propose a new type of neural network, the Kronecker neural network (KNN), which forms a general framework for neural networks with adaptive activation functions. KNNs employ the Kronecker product, which provides an efficient way of constructing a very wide network while keeping the number of parameters low. Our theoretical analysis reveals that, under suitable conditions, KNNs induce a faster decay of the loss than feed-forward networks; this is also verified empirically through a set of computational examples. Furthermore, under certain technical assumptions, we establish global convergence of gradient descent for KNNs. As a specific case, we propose the Rowdy activation function, which is designed to eliminate any saturation region by injecting sinusoidal fluctuations with trainable parameters. The proposed Rowdy activation function can be employed in any neural network architecture, including feed-forward, recurrent, and convolutional neural networks. The effectiveness of KNNs with Rowdy activations is demonstrated through various computational experiments, including function approximation using feed-forward neural networks, solution inference of partial differential equations using physics-informed neural networks, and standard deep learning benchmark problems using convolutional and fully-connected neural networks.
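
As a rough illustration, below is a minimal sketch of a Rowdy-style tanh activation in TensorFlow 1, assuming the form tanh(x) + sum over k of n·a_k·sin((k−1)·n·x) with trainable amplitudes a_k. The function name `rowdy_tanh`, the number of terms `K`, the scaling factor `n`, and the zero initialization of the amplitudes are illustrative choices, not necessarily those used in this repository's code.

    import tensorflow as tf  # TensorFlow 1.x API

    def rowdy_tanh(x, K=5, n=10.0, name="rowdy"):
        """Sketch of a Rowdy-style activation: a tanh base plus K-1
        trainable sinusoidal perturbation terms. The amplitudes a_k
        start at zero, so training begins from a plain tanh network;
        n is a fixed scaling factor (illustrative value)."""
        out = tf.tanh(x)
        for k in range(2, K + 1):
            # One trainable amplitude per sinusoidal term.
            a_k = tf.Variable(0.0, dtype=x.dtype, name="%s_a%d" % (name, k))
            out = out + n * a_k * tf.sin((k - 1) * n * x)
        return out

Because the amplitudes are initialized at zero, the network starts as an ordinary tanh network and the optimizer is free to grow the fluctuation terms only where they help.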

CODE: High-frequency function approximation (one-dimensional)
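
For context, here is a hypothetical end-to-end usage sketch (not the repository's actual script) that fits a high-frequency one-dimensional target with the `rowdy_tanh` defined above; the network width, depth, learning rate, and target function are all illustrative choices.

    import numpy as np
    import tensorflow as tf  # TensorFlow 1.x API

    # Hypothetical high-frequency 1-D target.
    x_data = np.linspace(-1.0, 1.0, 512).reshape(-1, 1).astype(np.float32)
    y_data = np.sin(20.0 * x_data)

    x = tf.placeholder(tf.float32, [None, 1])
    y = tf.placeholder(tf.float32, [None, 1])

    # Two hidden layers, each followed by the Rowdy-style activation.
    h = rowdy_tanh(tf.layers.dense(x, 50), name="act1")
    h = rowdy_tanh(tf.layers.dense(h, 50), name="act2")
    y_pred = tf.layers.dense(h, 1)

    loss = tf.reduce_mean(tf.square(y_pred - y))
    train_op = tf.train.AdamOptimizer(1e-3).minimize(loss)

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        for step in range(5000):
            _, l = sess.run([train_op, loss],
                            feed_dict={x: x_data, y: y_data})
            if step % 1000 == 0:
                print("step %d, loss %.6f" % (step, l))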

References for Rowdy activation functions:

  1. A.D. Jagtap, Y. Shin, K. Kawaguchi, G.E. Karniadakis, Deep Kronecker neural networks: A general framework for neural networks with adaptive activation functions, Neurocomputing, 468, 165-180, 2022. (https://www.sciencedirect.com/science/article/pii/S0925231221015162)

    @article{jagtap2022deep,
    title={Deep Kronecker neural networks: A general framework for neural networks with adaptive activation functions},
    author={Jagtap, Ameya D and Shin, Yeonjong and Kawaguchi, Kenji and Karniadakis, George Em},
    journal={Neurocomputing},
    volume={468},
    pages={165--180},
    year={2022},
    publisher={Elsevier}
    }
    
  2. A.D. Jagtap, K. Kawaguchi, G.E. Karniadakis, Adaptive activation functions accelerate convergence in deep and physics-informed neural networks, Journal of Computational Physics, 404, 109136, 2020. (https://doi.org/10.1016/j.jcp.2019.109136)

    @article{jagtap2020adaptive,
    title={Adaptive activation functions accelerate convergence in deep and physics-informed neural networks},
    author={Jagtap, Ameya D and Kawaguchi, Kenji and Karniadakis, George Em},
    journal={Journal of Computational Physics},
    volume={404},
    pages={109136},
    year={2020},
    publisher={Elsevier}
    }
    
  3. A.D. Jagtap, K. Kawaguchi, G.E. Karniadakis, Locally adaptive activation functions with slope recovery for deep and physics-informed neural networks, Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences, 476(2239), 20200334, 2020. (http://dx.doi.org/10.1098/rspa.2020.0334)

    @article{jagtap2020locally,
    title={Locally adaptive activation functions with slope recovery for deep and physics-informed neural networks},
    author={Jagtap, Ameya D and Kawaguchi, Kenji and Em Karniadakis, George},
    journal={Proceedings of the Royal Society A},
    volume={476},
    number={2239},
    pages={20200334},
    year={2020},
    publisher={The Royal Society}
    }
    

Please feel free to reach out with any questions.
