HeLLST4R's repositories
AI-Audio-Data-Poisoning
AI Audio Data Poisoning is a Python script that demonstrates how to add adversarial noise to audio data. This technique, known as audio data poisoning, involves injecting imperceptible noise into audio files to manipulate the behavior of AI systems trained on this data.
AI-Image-Data-Poisoning
AI Image Data Poisoning is a Python script that demonstrates how to add imperceptible perturbations to images, known as adversarial noise, which can disrupt the training process of AI models.
AI-Vulnerability-Assessment-Framework
The AI Vulnerability Assessment Framework is an open-source checklist designed to guide users through the process of assessing the vulnerability of artificial intelligence (AI) systems to various types of attacks and security threats
ASCII-Art-Prompt-Injection
ASCII Art Prompt Injection is a novel approach to hacking AI assistants using ASCII art. This project leverages the distracting nature of ASCII art to bypass security measures and inject prompts into large language models, such as GPT-4, leading them to provide unintended or harmful responses.
axiom
The dynamic infrastructure framework for everybody! Distribute the workload of many different scanning tools with ease, including nmap, ffuf, masscan, nuclei, meg and many more!
bounty-targets
This project crawls bug bounty platform scopes (like Hackerone/Bugcrowd/Intigriti/etc) hourly and dumps them into the bounty-targets-data repo
bounty-targets-data
This repo contains hourly-updated data dumps of bug bounty platform scopes (like Hackerone/Bugcrowd/Intigriti/etc) that are eligible for reports
fleex
Fleex makes it easy to create multiple VPS on cloud providers and use them to distribute workloads.
HackerGPT
The official HackerGPT repository
HeLLST4R
Config files for my GitHub profile.
Image-Prompt-Injection
Image Prompt Injection is a Python script that demonstrates how to embed a secret prompt within an image using steganography techniques. This hidden prompt can be later extracted by an AI system for analysis, enabling covert communication with AI models through images.
nuclei
Fast and customizable vulnerability scanner based on simple YAML based DSL.
Nuclei-Templates-Collection
Nuclei Templates Collection
open-webui
User-friendly WebUI for LLMs (Formerly Ollama WebUI)
Payloads4All
A list of useful payloads and bypass for Web Application Security and Pentest/CTF
Prompt-Injection-Testing-Tool
The Prompt Injection Testing Tool is a Python script designed to assess the security of your AI system's prompt handling against a predefined list of user prompts commonly used for injection attacks. This tool utilizes the OpenAI GPT-3.5 model to generate responses to system-user prompt pairs and outputs the results to a CSV file for analysis.
tiktokenizer
Online playground for OpenAPI tokenizers