Madoshakalaka / EvoSeed

Source code for the article, "EvoSeed: Unveiling the Threat on Deep Neural Networks with Real-World Illusions"

Home Page: https://shashankkotyan.github.io/EvoSeed/

EvoSeed

Published as: "Breaking Free: How to Hack Safety Guardrails in Black-Box Diffusion Models!"

Key Contributions:

  • We propose a black-box algorithmic framework based on an Evolutionary Strategy, titled EvoSeed, to generate natural adversarial samples in an unrestricted setting.
  • Our results show that adversarial samples created using EvoSeed are photo-realistic and do not change the human perception of the generated image, yet can be misclassified by a range of robust and non-robust classifiers.
Figure: Adversarial images created with EvoSeed are prime examples of how to deceive a range of classifiers tailored for various tasks. Note that the generated natural adversarial images differ from their non-adversarial counterparts, reflecting the unrestricted nature of the attack.
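The core idea sketched above (a black-box Evolutionary Strategy that searches over the initial latent seed of a generative model until the generated image fools a classifier) can be illustrated with a minimal toy example. This is a simplified (1, λ)-style ES, not necessarily the exact strategy used in the paper, and `generate`/`classify` are toy stand-ins for the diffusion model and victim classifier:

```python
import numpy as np

def evolve_seed(generate, classify, true_label, z0,
                iters=200, pop=16, sigma=0.1, seed=0):
    """Black-box evolutionary search over the initial latent seed:
    sample perturbed seeds, keep the candidate that most lowers the
    classifier's confidence in the true label, and stop early once
    the generated sample is misclassified."""
    rng = np.random.default_rng(seed)
    z = np.asarray(z0, dtype=float)
    best_conf = classify(generate(z))[true_label]
    for _ in range(iters):
        candidates = z + sigma * rng.standard_normal((pop, z.size))
        confs = np.array([classify(generate(c))[true_label] for c in candidates])
        i = int(np.argmin(confs))
        if confs[i] < best_conf:                  # accept only improvements
            z, best_conf = candidates[i], confs[i]
        probs = classify(generate(z))
        if int(np.argmax(probs)) != true_label:   # adversarial: stop early
            return z, probs
    return z, classify(generate(z))

# Toy stand-ins for the diffusion model G(z) and the victim classifier f(x).
rng = np.random.default_rng(1)
W_gen = rng.standard_normal((8, 4))
W_clf = rng.standard_normal((3, 8))

def generate(z):
    return np.tanh(W_gen @ z)

def classify(x):
    logits = W_clf @ x
    e = np.exp(logits - logits.max())
    return e / e.sum()           # softmax probabilities

z0 = rng.standard_normal(4)
label = int(np.argmax(classify(generate(z0))))   # initially predicted class
z_adv, probs = evolve_seed(generate, classify, label, z0)
```

Because only candidates with strictly lower confidence are accepted, the confidence of the final seed in the true label never exceeds that of the starting seed; the search needs only black-box access to `classify`, matching the setting described above.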

Citation:

If you find this project useful, please cite:

@article{kotyan2024EvoSeed,
  title = {Breaking Free: How to Hack Safety Guardrails in Black-Box Diffusion Models!},
  author = {Kotyan, Shashank and Mao, Po-Yuan and Chen, Pin-Yu and Vargas, Danilo Vasconcellos},
  year = {2024},
  month = may,
  number = {arXiv:2402.04699},
  eprint = {2402.04699},
  publisher = {{arXiv}},
  doi = {10.48550/arXiv.2402.04699},
}

About

Source code for the article, "EvoSeed: Unveiling the Threat on Deep Neural Networks with Real-World Illusions"

https://shashankkotyan.github.io/EvoSeed/

License: Creative Commons Zero v1.0 Universal


Languages

  • Python 76.3%
  • HTML 17.2%
  • CSS 6.0%
  • TeX 0.5%