AKASH2907 / deepfakes_video_classification

Deepfakes Video classification via CNN, LSTM, C3D and triplets [IWBF'20]

training data

ss880426 opened this issue · comments

May I ask how your FF++ c40 accuracy of 86% was obtained?
Was the model trained on
real + all 4 manipulation types together,
or were separate models trained on
real + Deepfakes,
real + Face2Face,
real + FaceSwap,
real + NeuralTextures,
and the results averaged?

Thanks!!

It's NeuralTextures only.

Can you provide your batch size, learning rate, and number of epochs?
Thanks

Batch size is mentioned in the code repo, and the learning rate and epochs are in the paper. Please go through the code and paper carefully before opening an issue. BS: 32, LR: 0.002, epochs: 30-50.

Sorry, I used 43k FF++ c40 NeuralTextures frames for training and didn't change any parameters, but the result is only ~50% AUC and accuracy. Which of these steps am I getting wrong?

  1. frames_extraction.py
  2. face_extraction.py
  3. data_csv.py
  4. 08.facenet_embeddings.py
  5. 09.train_triplets_semi_hard.py

Thanks!!

Use a batch size of 256 or 512 for triplet training... You must be getting NaN loss with batch size 32.
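
Roughly the kind of setup I mean (a minimal sketch using tensorflow_addons; the projection size and margin are illustrative, not the repo's exact code):

import tensorflow as tf
import tensorflow_addons as tfa

# Small projection head on top of the 512-d FaceNet embeddings,
# trained with semi-hard triplet loss.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(128, input_shape=(512,)),
    tf.keras.layers.Lambda(lambda x: tf.math.l2_normalize(x, axis=1)),  # unit-norm embeddings
])
model.compile(optimizer=tf.keras.optimizers.Adam(0.002),
              loss=tfa.losses.TripletSemiHardLoss(margin=0.2))

# x_train: (43000, 512) float32 embeddings, y_train: (43000,) int labels (0 = real, 1 = fake).
# A large batch gives semi-hard mining enough valid anchor/positive/negative
# combinations per step; very small batches can make the loss blow up.
# model.fit(x_train, y_train, batch_size=256, epochs=30)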

I don't get any NaN loss.
I tried 32, 256, and 512, but the loss stays around 0.4 and won't go down.
The only change I made was to squeeze the training data from (43000, 1, 512) to (43000, 512).
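
i.e. just this (a minimal illustration; the file name is a placeholder, not the repo's):

import numpy as np

embeddings = np.load("train_embeddings.npy")   # placeholder name, shape (43000, 1, 512)
embeddings = np.squeeze(embeddings, axis=1)    # -> (43000, 512)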

Batch sizes of 128 and below gave NaN loss for me. However, after running evaluate_triplets.py your score is 50%?

Yes, AUC: 54, acc: 51.
I also trained on Deepfakes with train_triplets_semi_hard.py and got acc: 98, but in evaluate_triplets.py the acc is only 78.

50% accuracy is chance level. The model hasn't learned anything, so there's some issue.
I have only trained on c40 NeuralTextures, not Deepfakes, so I can't comment.

My acc: 50 is from training only on c40 NeuralTextures.
The only change I made to the facenet script was replacing train_label += [label.argmax(1)]
with train_label += [label], but I don't think that's the problem.
Or could you provide your pretrained model and datasets?

If I don't change it, it reports an error:
'int' object has no attribute 'argmax'
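
What I mean (a minimal sketch with made-up labels, not the actual script): argmax(1) only works when the labels are one-hot arrays, and with plain integer labels it raises exactly that error.

import numpy as np

def to_int_labels(label):
    # Return integer class ids whether `label` is one-hot or already integer-encoded.
    arr = np.asarray(label)
    if arr.ndim >= 2:              # one-hot, e.g. shape (N, 2)
        return arr.argmax(axis=1)
    return arr                     # already integers, e.g. 0 = real, 1 = fake

print(to_int_labels([[0, 1], [1, 0]]))   # one-hot  -> [1 0]
print(to_int_labels([1, 1, 0]))          # integers -> [1 1 0]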

I've tried this model and it doesn't get any better

Can you share your train_label?

Do both the SGD and random forest classifiers score around 50%?

train_faces_25frames.csv
My shapes are (38700, 512), (38700,), (4300, 512), (4300,).
Labels: [1 1 1 ... 0 0 0]
I trained for 50 epochs; the loss rose to 0.8 and then dropped to 0.3.
The accuracy of both RF and SGD is 96,
but it's still only 50% on evaluate.
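
Roughly what that RF/SGD check looks like on my side (a minimal sketch assuming scikit-learn and placeholder arrays with the shapes above; the repo's script may differ):

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import SGDClassifier
from sklearn.metrics import accuracy_score

# Placeholder arrays with the shapes above; in practice these are the 512-d
# triplet embeddings and their frame-level real/fake labels.
rng = np.random.default_rng(0)
X_train, y_train = rng.normal(size=(38700, 512)), rng.integers(0, 2, 38700)
X_test, y_test = rng.normal(size=(4300, 512)), rng.integers(0, 2, 4300)

for name, clf in [("sgd", SGDClassifier(max_iter=1000)),
                  ("rf", RandomForestClassifier(n_estimators=100))]:
    clf.fit(X_train, y_train)
    print(name, "frame-level acc:", accuracy_score(y_test, clf.predict(X_test)))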

Can you also share your testing_videos list?

The testing videos are from the same distribution; that's why I can't figure out why your score is dropping so much.

I divided the NeuralTextures videos into 860 for training and 140 for testing.

test.csv

Can you try it once again? I made a few minor changes in the evaluation script. Also, can you share your other scores as well? Precision, recall, ...

This is my result:
AUC Score: 0.4739285714285715
Accuracy: 0.4785714285714286
Precision: 0.47761194029850745
Recall: 0.45714285714285713
F1 score: 0.46715328467153283

I think there is something wrong with your code; it gives the opposite answer:

probab_mean = 1 - probab_mean
y_probabilities += [probab_mean]
# print(pred_mean)
if pred_mean > 0.5:
    y_predictions += [0]
else:
    y_predictions += [1]

But even with the labels flipped, my result is still around 50%.
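
In other words, the two mappings I compared (a minimal sketch assuming 1 = fake and 0 = real; not the repo's exact code):

def predict_video(pred_mean, invert, threshold=0.5):
    # Map a video's mean per-frame score to a class id, optionally flipping
    # the convention the way the snippet above does.
    label = 1 if pred_mean > threshold else 0
    return 1 - label if invert else label

print(predict_video(0.8, invert=False))  # -> 1
print(predict_video(0.8, invert=True))   # -> 0, the "opposite answer"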

I tried running train_triplets_semi_hard.py -f false with those 140 test videos after the FaceNet step, and it is still 50%.

I don't have the dataset now, and I'm trying to find some version of the code. It's been 2.5+ years, and I don't have the exact set of train and test videos. But still, the data distribution is the same, so the drop shouldn't be that large.

Test videos are evaluated with evaluate_triplets.py. Try reducing the train size and see whether the test accuracy from train_triplets_semi_hard.py -f false decreases; ideally, it should. I have modified that piece of code in evaluate_triplets.py.

ok
thank you!