curiousily / Credit-Card-Fraud-Detection-using-Autoencoders-in-Keras

iPython notebook and pre-trained model showing how to build a deep Autoencoder in Keras for Anomaly Detection in credit card transaction data

Home Page: https://www.curiousily.com/posts/credit-card-fraud-detection-using-autoencoders-in-keras/


Why are you using relu on the last layer?

qbx2 opened this issue · comments

from keras.layers import Input, Dense
from keras.models import Model
from keras import regularizers

# input_dim and encoding_dim are defined earlier in the notebook
input_layer = Input(shape=(input_dim, ))

encoder = Dense(encoding_dim, activation="tanh",
                activity_regularizer=regularizers.l1(10e-5))(input_layer)
encoder = Dense(int(encoding_dim / 2), activation="relu")(encoder)

decoder = Dense(int(encoding_dim / 2), activation='tanh')(encoder)
decoder = Dense(input_dim, activation='relu')(decoder)

autoencoder = Model(inputs=input_layer, outputs=decoder)

In fraud_detection.ipynb, the model uses relu as the activation of its last layer. However, the CSV file contains negative values, which relu cannot represent. I think the last layer of the decoder should be able to reproduce the input values. Wouldn't this be an issue?

Thanks.
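To make the concern concrete, here is a minimal sketch (not from the notebook; the sample values are illustrative) showing that relu clips negative values, so a relu output layer cannot reconstruct the negative entries that appear in the standardized transaction features:

import numpy as np

# relu(x) = max(0, x), so any negative component of the input
# is unreachable by a relu output layer.
x = np.array([-1.36, 0.42, -2.77])  # illustrative standardized feature values
print(np.maximum(0.0, x))           # [0.   0.42 0.  ] -- the negatives are lost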

I have the same issue. The changes in the network's construction seem related only to the regularization method, not to the choice of activation.
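For what it's worth, a common remedy (an assumption on my part, not a change made in this repo) is to keep the architecture but give the output layer a linear activation, so the decoder can produce negative reconstructions. The input_dim and encoding_dim values below are assumed, standing in for whatever the notebook defines:

from keras.layers import Input, Dense
from keras.models import Model
from keras import regularizers

input_dim = 29     # assumed: number of features after dropping the Time column
encoding_dim = 14  # assumed: bottleneck size

input_layer = Input(shape=(input_dim, ))
encoder = Dense(encoding_dim, activation="tanh",
                activity_regularizer=regularizers.l1(10e-5))(input_layer)
encoder = Dense(int(encoding_dim / 2), activation="relu")(encoder)

decoder = Dense(int(encoding_dim / 2), activation='tanh')(encoder)
# linear output lets the reconstruction take negative values
decoder = Dense(input_dim, activation='linear')(decoder)

autoencoder = Model(inputs=input_layer, outputs=decoder)
autoencoder.compile(optimizer='adam', loss='mean_squared_error')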