kaylode / mediaeval21-drsf

De-identify facial information in image and video by using Adversarial Attacks


MediaEval 2021 - Driving Road Safety Forward: Video Data Privacy

Facial Data De-identification with Adversarial Generation and Perturbation Methods

Task Overview

Data Overview

Method Overview

  • Run 01: FaceSwap
    • Swap the driver's face to fully hide their true identity while preserving other information, such as facial expressions and behaviours.
  • Run 02: Adversarial Attack
    • Perturb the face so that unauthorized models are prevented from accessing and exploiting information in the de-identified data, while the authorized model can still access all of it.
Example results (left to right): Original image, Adversarial Perturbation, FaceSwap.
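The adversarial-perturbation run follows the standard idea of nudging pixels along the gradient of an identity model's loss, bounded so the image stays visually similar. A minimal sketch of that idea using the Fast Gradient Sign Method (FGSM) on a toy linear identity scorer; the model, weights, and epsilon here are illustrative assumptions, not the repository's actual pipeline:

```python
import numpy as np

def fgsm_perturb(x, grad, epsilon=0.03):
    """FGSM step: move each pixel by epsilon in the direction that
    increases the identity loss, then clip back to valid pixel range."""
    x_adv = x + epsilon * np.sign(grad)
    return np.clip(x_adv, 0.0, 1.0)

# Toy "identity classifier": logistic regression on a flattened face crop.
# (Hypothetical stand-in for a real face-recognition model.)
rng = np.random.default_rng(0)
w = rng.normal(size=64)       # assumed model weights
x = rng.uniform(size=64)      # assumed face crop, pixels in [0, 1]

# Identity score p = sigmoid(w.x); for loss L = -log(1 - p) (push the
# score down), the input gradient is dL/dx = p / (1 - p) * ... simplified
# here to the direction that lowers p: grad = -(1 - p) * w.
p = 1.0 / (1.0 + np.exp(-w @ x))
grad = -(1.0 - p) * w

x_adv = fgsm_perturb(x, grad, epsilon=0.03)
p_adv = 1.0 / (1.0 + np.exp(-w @ x_adv))

# The perturbation is bounded per pixel, yet the identity score drops.
print("max pixel change:", np.max(np.abs(x_adv - x)))
print("identity score before/after:", p, p_adv)
```

In practice the gradient would come from backpropagating through a deep face-recognition network rather than this closed-form toy, but the bounded sign-step structure is the same.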

Working note



Languages

Jupyter Notebook 82.3%, Python 17.7%