djm160830 / nn-enhanced

A neural network trained with the Adam optimizer on the Heart Failure Clinical Records dataset, comparing the test errors of the sigmoid, tanh, and ReLU activation functions.

Train/test split: 80% of the dataset is used for training and 20% for testing, as in the sketch below.

How to run

Install the requirements, then run the script:

pip install -r requirements.txt
python nn-enhanced.py

Expected output (sample run):

ACTIVATION: sigmoid
Total Test error with sigmoid activation:                             8.414172716246473
ACTIVATION: tanh

After 60000 iterations, total error with tanh activation function: 3.0528201232835785
ACTIVATION: tanh
Total Test error with tanh activation:                             7.406216006829898
ACTIVATION: relu

After 60000 iterations, total error with relu activation function: 8.424554943130456
ACTIVATION: relu
Total Test error with relu activation:                             7.2460576025569505
