This repository contains the source code used in our paper for evaluating forgery classification accuracy on the ForgeryNIR dataset. Since preprocessing can take considerable time, we provide features extracted from these images by our trained feature extraction models, together with the ForgeryClassifier used to obtain the testing accuracy.
We also provide models trained on the WildDeepfake dataset.
- Anaconda3 (Python3.9, with Numpy etc.)
- PyTorch 1.12.0
The ForgeryNIR dataset contains 240,000 forged NIR images:
- images generated via 4 different GAN techniques;
- images with different numbers of perturbations added;
- images generated by models of the same GAN at different training epochs.
Dataset Name | Download | Images |
---|---|---|
ForgeryNIR | ForgeryNIR | 240,000 |
Before running the code, check config.py in the folder named wildDeepfake and edit it to suit your environment.
torchrun --nproc_per_node={number of GPUs} train.py
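For example, to launch training on a machine with 4 GPUs:

torchrun --nproc_per_node=4 train.py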
Feature and Model | Download |
---|---|
HFC-MFFD | BaiduNetDisk(ia97) |
After downloading the features and trained models, put the features in ./ForgeryNIR/feature and the models in ./ForgeryNIR/model. Otherwise, change the default paths declared in config.py in the folder named ForgeryNIR.
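As a rough illustration, the path settings to adjust in config.py would look something like the sketch below; the variable names here are assumptions, so check config.py for the actual ones.

# Illustrative only: variable names are placeholders, not the repository's actual config keys.
feature_dir = './ForgeryNIR/feature'   # where the downloaded features are placed
model_dir = './ForgeryNIR/model'       # where the downloaded trained models are placed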
python -m simple_test --train_dir std_multi --test_dir mix_multi
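The --train_dir and --test_dir arguments select which protocol's features are used for training and testing. For example, to evaluate on the same protocol the model was trained on (assuming the corresponding feature directory has been downloaded):

python -m simple_test --train_dir std_multi --test_dir std_multi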
Before running the code, install torch-dct, which performs DCT transforms on tensors:
pip install torch-dct
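A minimal example of applying a 2D DCT with torch-dct (the tensor shape is illustrative):

import torch
import torch_dct as dct

x = torch.randn(8, 3, 224, 224)   # a batch of image-like tensors
X = dct.dct_2d(x)                 # 2D DCT over the last two dimensions
x_rec = dct.idct_2d(X)            # inverse 2D DCT, recovers x up to numerical error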
Model | Download |
---|---|
HFC-MFFD | BaiduNetDisk(tj6r) |
After downloading the trained models, put the models in ./WildDeepFake/model. Otherwise, change the default path declared in get_result_test.py.
torchrun --nproc_per_node={number of GPUs} get_result_test.py
torchrun --nproc_per_node=1 get_score_save.py
@article{liu2022hierarchical,
title={Hierarchical Forgery Classifier On Multi-modality Face Forgery Clues},
author={Liu, Decheng and Zheng, Zeyang and Peng, Chunlei and Wang, Yukai and Wang, Nannan and Gao, Xinbo},
journal={arXiv preprint arXiv:2212.14629},
year={2022}
}