
The-Attention-and-Autoencoder-Hybrid-Learning-Model

• A mechanism to prolong the prediction time span of the PM2.5 concentration.
• A hybrid attention mechanism that takes the decoder sequence into consideration, paying due attention to data in the current time period (see the sketch below).
• Stronger constraints placed on data whose time span to the current time is longer.
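For a concrete picture, here is a minimal NumPy sketch of one possible reading of the hybrid attention idea: the attention score is computed from both the encoder outputs and the current decoder state, and a recency penalty constrains time steps that lie farther from the current time. The weight names and the decay factor are illustrative assumptions, not the implementation used in the paper or the notebooks.

```python
import numpy as np

def hybrid_attention(enc_outputs, dec_state, W_enc, W_dec, v, recency_decay=0.05):
    """enc_outputs: (T, d_enc); dec_state: (d_dec,). Returns (context, weights)."""
    T = enc_outputs.shape[0]
    # Additive (Bahdanau-style) score that mixes encoder outputs with the decoder state.
    scores = np.tanh(enc_outputs @ W_enc + dec_state @ W_dec) @ v        # shape (T,)
    # Constrain steps with a longer time span to the current step (index T - 1).
    distance = np.arange(T - 1, -1, -1)                                   # 0 = most recent
    scores = scores - recency_decay * distance
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                                              # softmax
    context = weights @ enc_outputs                                       # shape (d_enc,)
    return context, weights

# Toy usage with random tensors, just to show the shapes involved.
rng = np.random.default_rng(0)
T, d_enc, d_dec, d_att = 24, 32, 32, 16
context, weights = hybrid_attention(
    rng.normal(size=(T, d_enc)), rng.normal(size=d_dec),
    rng.normal(size=(d_enc, d_att)), rng.normal(size=(d_dec, d_att)),
    rng.normal(size=d_att))
```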

The patent for this project belongs to Shanghai Normal University.

Non-standardized code for air pollution prediction models, using LSTM, GRU, pure attention, a pure encoder-decoder, and the A-A model proposed in my paper "Longer Time-Span Air Pollution Prediction: The Attention and Autoencoder Hybrid Learning Model".
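As a point of reference, a plain recurrent baseline of the kind listed above can be as small as the sketch below. The window length, horizon, layer sizes, and the use of tf.keras are assumptions and may differ from the notebooks.

```python
import tensorflow as tf

WINDOW, N_FEATURES, HORIZON = 24, 8, 6   # assumed: 24 past steps -> 6 future PM2.5 values

baseline = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(WINDOW, N_FEATURES)),
    tf.keras.layers.LSTM(64),             # replace with GRU(64) for the GRU baseline
    tf.keras.layers.Dense(HORIZON),       # one PM2.5 value per predicted step
])
baseline.compile(optimizer="adam", loss="mse")
```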

In this repository, the code is separated into two parts: the first for data regularization and segmentation, and the second for the model experiments. The related data is also uploaded.
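As an illustration only, "regularization and segmentation" of an hourly pollution/weather table might look like the sketch below; the scaling choice, window length, and column layout are placeholders rather than the settings used in the notebooks.

```python
import numpy as np
import pandas as pd

def regularize(df: pd.DataFrame) -> pd.DataFrame:
    """Min-max scale every column to [0, 1]; the notebooks may use a different scaler."""
    return (df - df.min()) / (df.max() - df.min())

def segment(values: np.ndarray, window: int = 24, horizon: int = 6, target_col: int = 0):
    """Cut a (T, n_features) array into supervised (X, y) windows for sequence models."""
    X, y = [], []
    for start in range(len(values) - window - horizon + 1):
        X.append(values[start:start + window])                                 # past window
        y.append(values[start + window:start + window + horizon, target_col])  # future PM2.5
    return np.asarray(X), np.asarray(y)
```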

The concatenated data is the data that is finally used; the raw data can be found at the U.S. National Climate Data Center and the China Meteorological Bureau.
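One way such a concatenated dataset can be assembled is sketched below; the file names, column names, and the timestamp key are hypothetical, not the repository's actual files.

```python
import pandas as pd

# Join meteorological records and PM2.5 records on a shared timestamp column.
weather = pd.read_csv("weather.csv", parse_dates=["datetime"])
pm25 = pd.read_csv("pm25.csv", parse_dates=["datetime"])
merged = (pd.merge(weather, pm25, on="datetime", how="inner")
            .sort_values("datetime")
            .reset_index(drop=True))
merged.to_csv("concatenated_data.csv", index=False)
```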

The parameters used in the model can be found in the package called reused parameters; that directory should be marked as the source root.
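The idea of a shared parameter package, with its directory marked as the source root so the notebooks can import it directly, might look roughly like this; module and constant names are illustrative only.

```python
# reused_parameters/config.py  (hypothetical module name)
TIME_STEPS = 24        # length of the input window
HORIZON = 6            # number of future steps to predict
HIDDEN_UNITS = 64      # recurrent layer width
LEARNING_RATE = 1e-3

# In an experiment notebook, once the package directory is marked as source root:
#     from config import TIME_STEPS, HORIZON, HIDDEN_UNITS, LEARNING_RATE
```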

The structure of the model has been submitted, but the parameters should be tuned carefully for each dataset.
