Evadingfakedetector

We propose a statistical consistency attack (StatAttack) against diverse DeepFake detectors.

Requirements

  • numpy==1.24.1
  • opencv_python==4.8.0.76
  • Pillow==10.0.0
  • scikit_learn==1.3.0
  • scipy==1.11.1
  • torch==2.0.1+cu118
  • torchattacks==3.4.0
  • torchvision==0.15.2+cu118
  • umap==0.1.1

Dataset

We conduct a comprehensive evaluation based on four generation methods. The generated face dataset includes entire-face-synthesis images, face-identity-swap images, and face-manipulation images.

Usage

  1. Clone this repository and install the required modules as listed in requirements.txt.
  2. Place the detection model code in the ./model directory. Follow the resnet50.py example inside the ./model folder to add hooks for obtaining the MMD loss.
  3. Run demo.py to generate adversarial examples.

Bibtex

@inproceedings{hou2023evading,
  title={Evading DeepFake Detectors via Adversarial Statistical Consistency},
  author={Hou, Yang and Guo, Qing and Huang, Yihao and Xie, Xiaofei and Ma, Lei and Zhao, Jianjun},
  booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition},
  pages={12271--12280},
  year={2023}
}
