feiyizdx / Taylor-net

Symplectic Taylor Neural Networks

Summary

We propose an effective and lightweight learning algorithm, Symplectic Taylor Neural Networks (Taylor-nets), to conduct continuous, long-term predictions of a complex Hamiltonian dynamic system based on sparse, short-term observations. At the heart of our algorithm is a novel neural network architecture consisting of two sub-networks. Both are embedded with terms in the form of Taylor series expansion that are designed with a symmetric structure. The key mechanism underpinning our infrastructure is the strong expressiveness and special symmetric property of the Taylor series expansion, which can inherently accommodate the numerical fitting process of the spatial derivatives of the Hamiltonian as well as preserve its symplectic structure. We further incorporate a fourth-order symplectic integrator, in conjunction with the neural ODE framework, into our Taylor-net architecture to learn the continuous time evolution of the target systems while preserving their symplectic structures.
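
As a concrete illustration only (not the repository's code), the sketch below shows one possible PyTorch realization of such a sub-network, assuming each Taylor term takes the symmetric form A_i^T f_i(A_i x) - B_i^T f_i(B_i x), with f_i(x) = x^i / i! applied elementwise. This pairing makes the Jacobian of the output symmetric, so the sub-network can represent the gradient of a scalar (Hamiltonian) function. The class and parameter names (TaylorSubNet, M, hidden_dim) are illustrative.

    import torch
    import torch.nn as nn
    from math import factorial

    class TaylorSubNet(nn.Module):
        """Sketch of one Taylor-net sub-network (e.g. an approximation of dH/dp).
        Each Taylor term pairs a weight matrix with its own transpose, so the
        Jacobian of the output is symmetric and the network represents a
        gradient field.  Illustrative only; not the repository's implementation."""

        def __init__(self, dim, hidden_dim=128, M=8):
            super().__init__()
            # One (A_i, B_i) pair of weight matrices per Taylor term.
            self.A = nn.ParameterList(
                [nn.Parameter(0.01 * torch.randn(hidden_dim, dim)) for _ in range(M)])
            self.B = nn.ParameterList(
                [nn.Parameter(0.01 * torch.randn(hidden_dim, dim)) for _ in range(M)])
            self.b = nn.Parameter(torch.zeros(dim))

        @staticmethod
        def _f(x, i):
            # i-th Taylor term, applied elementwise: x^i / i!
            return x.pow(i) / factorial(i)

        def forward(self, x):
            # x has shape (batch, dim); the output has the same shape.
            out = torch.zeros_like(x)
            for i, (A, B) in enumerate(zip(self.A, self.B), start=1):
                out = out + self._f(x @ A.t(), i) @ A   # A_i^T f_i(A_i x)
                out = out - self._f(x @ B.t(), i) @ B   # -B_i^T f_i(B_i x)
            return out + self.b

Two such sub-networks, approximating the spatial derivatives of the Hamiltonian with respect to p and q, are then advanced inside the fourth-order symplectic integrator, which is what allows the learned dynamics to preserve the symplectic structure over long prediction horizons.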

Prerequisites

Python 3 must be installed in order to run the program. In a newly created virtual environment, run the following command:

pip install -r requirements.txt

All the required dependencies will then be installed.

Usage

We demonstrated the efficacy of our Taylor-net in predicting a broad spectrum of Hamiltonian dynamic systems, including the pendulum, the Lotka-Volterra, the Kepler, and the Hénon-Heiles systems.

In order to train a Taylor-net:

  • Pendulum: python3 Pendulum/Pendulum.py
  • Lotka-Volterra: python3 Lotka_Volterra/LV.py
  • Hénon-Heiles: python3 Henon_Heiles/Henon_Heiles.py

Problem setups

Problems                   Pendulum  Lotka-Volterra  Hénon-Heiles  Kepler
Training period            0.01      0.01            0.01          0.01
Predicting period          20π       10              20π
Sample size                15        25              25            25
Epoch                      100       150             100           50
Learning rate              0.002     0.003           0.001         0.001
step_size                  10        10              10            10
γ                          0.8       0.8             0.8           0.8
M*                         8         8               12            20
Dimension of hidden layer  128       128             32            32

*M: the number of terms of the Taylor polynomial introduced in the construction of the neural networks
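
The step_size and γ rows presumably correspond to the arguments of PyTorch's StepLR learning-rate scheduler (an assumption based on the names). As a hypothetical illustration, the pendulum column could be wired up as follows, reusing the TaylorSubNet sketch from the Summary section; the optimizer choice (Adam), the data, and the loss are placeholders, not the repository's actual training setup.

    import torch

    # Hyperparameters from the pendulum column: lr 0.002, step_size 10,
    # gamma 0.8, 100 epochs, sample size 15, M = 8, hidden dimension 128.
    model = TaylorSubNet(dim=1, hidden_dim=128, M=8)   # sketch class from the Summary section
    optimizer = torch.optim.Adam(model.parameters(), lr=0.002)   # optimizer choice is assumed
    scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=10, gamma=0.8)

    p_obs = torch.randn(15, 1)    # stand-in for 15 short-term observations
    target = torch.randn(15, 1)   # stand-in for the corresponding time derivatives

    for epoch in range(100):
        optimizer.zero_grad()
        loss = torch.mean((model(p_obs) - target) ** 2)  # placeholder objective
        loss.backward()
        optimizer.step()
        scheduler.step()  # multiply the learning rate by gamma every step_size epochs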

Results

Pendulum Problem (figure)

Lotka-Volterra Problem (figure)

Hénon-Heiles Problem (figure)

Table of Losses

Problems       Pendulum  Lotka-Volterra  Hénon-Heiles  Kepler
Training loss  2.75e-05  2.37e-05        9.24e-06      7.29e-05
Testing loss   1.39e-04  6.73e-05        9.44e-06      6.41e-05

N-body

3-body and 6-body simulations (figures)

Dependencies

  • PyTorch
  • NumPy
  • h5py
  • Matplotlib
