
EnsembleTimeSeries

Ensemble Machine Learning for Time Series: an ensemble of Deep Recurrent Neural Networks and Random Forests combined through a stacking (averaging) layer and trained with Genetic Algorithms.
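The base learners (Deep RNNs and a Random Forest) are combined by a stacking layer that averages their forecasts. Below is a minimal sketch of that averaging step; the helper and array names are illustrative assumptions, not code from this repository.

# Illustrative only: stacking_average and the prediction arrays are hypothetical.
import numpy as np

def stacking_average(predictions):
    # Element-wise mean over the base-model forecasts (the averaging "stacking" layer).
    return np.mean(np.stack(predictions, axis=0), axis=0)

# Example: combine forecasts from two RNNs and one Random Forest.
rnn_pred_1 = np.array([10.2, 11.0, 12.4])
rnn_pred_2 = np.array([10.5, 10.8, 12.1])
rf_pred = np.array([10.0, 11.3, 12.6])

ensemble_pred = stacking_average([rnn_pred_1, rnn_pred_2, rf_pred])
print(ensemble_pred)  # approx. [10.23 11.03 12.37]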

Prerequisites

Python: 3.6.1
Numpy: 1.12.1
Pandas: 0.20.1
Keras: 2.0.6
Scikit-Learn: 0.18.1
Theano: 0.9.0
Tensorflow: 1.2.1
Pydot: 1.0.29
GraphViz: 2.38.0
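As an optional check, the snippet below confirms that the pinned Python packages are importable and prints their versions (exact versions on your system may differ; GraphViz is a system-level dependency and is not importable from Python).

import sys
import numpy, pandas, keras, sklearn, theano, tensorflow, pydot

print(sys.version)  # expected: 3.6.x
for pkg in (numpy, pandas, keras, sklearn, theano, tensorflow, pydot):
    print(pkg.__name__, getattr(pkg, "__version__", "unknown"))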

Initial configuration

The ensemble is configured through the following hyperparameters. They control the Genetic Algorithm, the Recurrent Neural Networks, and the Random Forests used to train the model.

train_size_percentage = 0.9  # Fraction of the data used for training
mutation_rate = 0.1  # Mutation rate for the GA
min_mutation_momentum = 0.0001  # Minimum mutation momentum
max_mutation_momentum = 0.1  # Maximum mutation momentum
min_population = 20  # Minimum GA population size
max_population = 50  # Maximum GA population size
num_Iterations = 10  # Number of GA iterations to evaluate
look_back = 1  # Number of time steps to look back for training and testing
max_dropout = 0.2  # Maximum dropout rate
min_num_layers = 1  # Minimum number of hidden layers
max_num_layers = 10  # Maximum number of hidden layers
min_num_neurons = 10  # Minimum number of neurons per hidden layer
max_num_neurons = 100  # Maximum number of neurons per hidden layer
min_num_estimators = 100  # Minimum number of Random Forest trees
max_num_estimators = 500  # Maximum number of Random Forest trees
rnn_epochs = 1  # Training epochs for each RNN
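As a rough illustration, the Genetic Algorithm can draw each candidate's hyperparameters uniformly from the ranges above. The encoding below is a hypothetical sketch and may not match the project's actual individual representation.

# Hypothetical sketch: sample one GA individual from the configured ranges.
import random

def random_individual(cfg):
    return {
        "dropout": random.uniform(0.0, cfg["max_dropout"]),
        "num_layers": random.randint(cfg["min_num_layers"], cfg["max_num_layers"]),
        "num_neurons": random.randint(cfg["min_num_neurons"], cfg["max_num_neurons"]),
        "num_estimators": random.randint(cfg["min_num_estimators"], cfg["max_num_estimators"]),
    }

cfg = dict(max_dropout=0.2, min_num_layers=1, max_num_layers=10,
           min_num_neurons=10, max_num_neurons=100,
           min_num_estimators=100, max_num_estimators=500)

# Initial population size falls between min_population and max_population.
population = [random_individual(cfg) for _ in range(random.randint(20, 50))]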


License

MIT License

