lyn1874 / memAE

Unofficial implementation of the paper "Memorizing Normality to Detect Anomaly: Memory-augmented Deep Autoencoder (MemAE) for Unsupervised Anomaly Detection".
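For orientation, the core idea of MemAE is that the encoder's feature is not decoded directly; instead it addresses a learned memory of "normal" prototypes via cosine-similarity softmax weights, which are sparsified by a hard shrinkage step. The sketch below is an illustrative NumPy version of that addressing step (function and variable names are mine, and the shrinkage here is a simple threshold rather than the paper's exact continuous shrinkage operator):

```python
import numpy as np

def memory_read(z, memory, shrink_thres=0.0025):
    """Illustrative MemAE-style memory addressing (NumPy sketch, not the repo's code).

    z:      (D,) encoder feature (query)
    memory: (N, D) memory matrix with N slots
    Returns (z_hat, w): the retrieved feature w @ memory and the addressing weights.
    """
    # Cosine similarity between the query and every memory slot
    sim = memory @ z / (np.linalg.norm(memory, axis=1) * np.linalg.norm(z) + 1e-12)
    # Softmax addressing weights (shifted for numerical stability)
    w = np.exp(sim - sim.max())
    w = w / w.sum()
    # Hard shrinkage: zero out small weights to promote sparse addressing,
    # then re-normalize so the weights still sum to 1
    w = np.where(w > shrink_thres, w, 0.0)
    w = w / (w.sum() + 1e-12)
    return w @ memory, w

rng = np.random.default_rng(0)
M = rng.normal(size=(10, 4))           # 10 memory slots of dimension 4
z = M[3] + 0.01 * rng.normal(size=4)   # query close to slot 3
z_hat, w = memory_read(z, M)           # w should peak at slot 3
```

Because anomalous inputs are reconstructed from combinations of normal prototypes, their reconstruction error tends to be larger, which is what the AUC evaluation below exploits.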


AUC on UCSD Ped2

Markovcom opened this issue · comments

I trained the model with your code on the UCSD Ped2 dataset, but I cannot reproduce the reported results.

------Data folder /data/datasets/ped2/testing/frames
------Model folder log/ped2/lr_0.00020_entropyloss_0.00020_version_0/
------Restored ckpt log/ped2/lr_0.00020_entropyloss_0.00020_version_0/model-0079.pt
data path: True
AutoEncoderCov3DMem
len data: 1830
The length of the reconstruction error is 1830
The length of the testing images is 1830
............start to checking the anomaly detection auc score...................
............use ckpt dir at step 79
Number of gt frames: 1830
Number of predictions: 1830
AUC score on data ped2 is 0.85
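For context, the score in the log above is presumably the standard frame-level ROC-AUC: per-frame reconstruction errors are used as anomaly scores and compared against per-frame ground-truth labels. A minimal, self-contained sketch of that metric (with synthetic stand-in data, since the real errors and labels come from the repo's evaluation script) is:

```python
import numpy as np

def frame_auc(labels, scores):
    """Frame-level ROC-AUC via the rank-based (Mann-Whitney) formula.
    Equivalent to sklearn.metrics.roc_auc_score when scores have no ties."""
    order = np.argsort(scores)
    ranks = np.empty(len(scores), dtype=float)
    ranks[order] = np.arange(1, len(scores) + 1)
    n_pos = labels.sum()
    n_neg = len(labels) - n_pos
    return (ranks[labels == 1].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

rng = np.random.default_rng(42)
n = 1830                         # same number of test frames as the Ped2 log above
gt = np.zeros(n, dtype=int)      # synthetic labels: 1 = anomalous frame
gt[900:1100] = 1
recon_err = rng.normal(0.1, 0.02, n)
recon_err[gt == 1] += 0.05       # anomalous frames reconstruct worse

auc = frame_auc(gt, recon_err)   # high, since anomalies get larger errors
```

Note that AUC is invariant under any monotone rescaling of the errors, so min-max normalizing the scores first (as many video-anomaly evaluation scripts do) does not change the result.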

commented

Hi, thanks for your interest in this implementation.

I have downloaded the repo and trained the model on the UCSDped2 dataset. My AUC varies between 0.90 and 0.94 (although I am not sure whether this is reasonable, since the authors did not report confidence intervals for their results). Also, the last checkpoint, model-0079.pt, is not always the best; you could also try evaluating the checkpoints from other steps.
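Sweeping over all saved checkpoints, as suggested above, can be sketched like this. The step-parsing helper follows the `model-0079.pt` naming seen in the log; `evaluate_auc` is a hypothetical placeholder for whatever evaluation entry point the repo provides:

```python
import glob
import re

def ckpt_step(path):
    """Extract the training step from a checkpoint name like 'model-0079.pt'."""
    m = re.search(r"model-(\d+)\.pt$", path)
    return int(m.group(1)) if m else -1

# Sketch: evaluate every saved checkpoint instead of only the last one.
model_dir = "log/ped2/lr_0.00020_entropyloss_0.00020_version_0/"
results = {}
for ckpt in sorted(glob.glob(model_dir + "model-*.pt"), key=ckpt_step):
    # results[ckpt_step(ckpt)] = evaluate_auc(ckpt)  # hypothetical call into the repo
    pass
# best_step = max(results, key=results.get)  # pick the checkpoint with the highest AUC
```

This is just the selection loop; the actual AUC computation per checkpoint would reuse the repo's test script.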

How did you get this file, Avenue_gt.npy?

@Markovcom What did you send? I can't see it.

I know it is in this folder: ./ckpt/Avenue_gt.npy. Is this file generated by the program? @lyn1874

Hi, I trained the model with this repo but ran into some problems. When entropyloss_weight is 0 (lr_0.00010_entropyloss_0.00000_version_0), the model reaches an AUC of 0.92 on UCSDped2. However, when entropyloss_weight is 0.0002 (lr_0.00010_entropyloss_0.00020_version_0), the model only reaches an AUC of 0.7938. Could you check why? And following the previous question (#4 (comment)), the learning rate setting similarly seems to matter a lot (lr_0.00010_entropyloss_0.00020_version_0, AUC=0.85). @lyn1874

commented

Hey, @interstate50 thanks for your interest. As for your questions:

  1. Yes, I am also seeing the same issue: when I include the entropyloss_weight term in the loss function, the AUC score sometimes gets worse. I think this is because some of the hyperparameters in my setup are not optimal. For example, I did not tune the number of training epochs, the learning rate decay schedule, or the optimizer.
  2. Yes, the learning rate setting is important, not only here but probably in the training of every neural network.
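For reference, the entropy term being discussed is the sparsity regularizer from the MemAE paper: the loss adds the Shannon entropy of the memory addressing weights, E(w) = sum_i(-w_i log w_i), scaled by the weight that appears in the folder names above (0.0002). A minimal NumPy sketch (the function names and the combined-loss helper are mine, not the repo's API):

```python
import numpy as np

def addressing_entropy(w, eps=1e-12):
    """Entropy of memory addressing weights, sum(-w * log(w)).
    Lower entropy = sparser (more peaked) addressing, which the
    regularizer encourages."""
    w = np.asarray(w, dtype=float)
    return float(-(w * np.log(w + eps)).sum())

def memae_loss(x, x_hat, w, entropy_weight=0.0002):
    """Sketch of the combined objective: mean squared reconstruction
    error plus the weighted entropy of the addressing weights."""
    recon = float(((np.asarray(x) - np.asarray(x_hat)) ** 2).mean())
    return recon + entropy_weight * addressing_entropy(w)

uniform = np.full(10, 0.1)                 # maximally spread addressing
peaked = np.array([0.91] + [0.01] * 9)     # sparse addressing
# addressing_entropy(peaked) < addressing_entropy(uniform)
```

Since the entropy of a 10-slot uniform distribution is ln(10) ≈ 2.3, a weight of 0.0002 keeps the term small relative to typical reconstruction errors; if the AUC drops when the term is enabled, retuning this weight jointly with the learning rate (as suggested above) is a natural first step.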