"Recursive Web Crawler: A Python tool for deep website exploration, finding subdomains, links, and JavaScript files. Ideal for security and web development."

Recursive Web Crawler

Recursive Web Crawler is a Python-based tool for exploring websites recursively and extracting useful information such as subdomains, links, and JavaScript files. This tool is intended for web security professionals and web developers who want to examine the structure and dependencies of websites.

Demo video: Recursive.Web.Crawler.mp4

Features

  • Recursively crawl websites to a specified depth (see the sketch after this list).
  • Extract subdomains, links, and JavaScript files.
  • Flexible depth parameter for customizing the level of recursion.
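
The core of these features is a small depth-limited recursion: fetch a page, record its script sources and links, then repeat on same-site links with the depth reduced by one. The sketch below is only an illustration of that idea, assuming requests and BeautifulSoup for fetching and parsing; the function name crawl and its exact signature are not taken from main.py.

    # Minimal sketch of a depth-limited recursive crawl (illustrative only;
    # the actual main.py may be organized differently).
    from urllib.parse import urljoin, urlparse

    import requests
    from bs4 import BeautifulSoup

    def crawl(url, depth, visited=None, subdomains=None, js_files=None):
        """Fetch `url`, collect script sources and links, then recurse."""
        visited = visited if visited is not None else set()
        subdomains = subdomains if subdomains is not None else set()
        js_files = js_files if js_files is not None else set()

        if depth < 0 or url in visited:
            return visited, subdomains, js_files
        visited.add(url)

        try:
            response = requests.get(url, timeout=10)
        except requests.RequestException:
            return visited, subdomains, js_files

        soup = BeautifulSoup(response.text, "html.parser")
        base_host = urlparse(url).netloc

        # Every <script src=...> is recorded as a JavaScript dependency.
        for script in soup.find_all("script", src=True):
            js_files.add(urljoin(url, script["src"]))

        # Follow same-host links; record any other host that turns up.
        for anchor in soup.find_all("a", href=True):
            link = urljoin(url, anchor["href"])
            host = urlparse(link).netloc
            if not host:
                continue
            if host != base_host:
                subdomains.add(host)
            else:
                crawl(link, depth - 1, visited, subdomains, js_files)

        return visited, subdomains, js_files
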
Usage

Run the Recursive Web Crawler with the following command:

python main.py -u <URL> -d <DEPTH>
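
Here -u is the start URL and -d is the maximum recursion depth. As a rough illustration of how these flags could be wired to the crawl sketch above, a hypothetical argparse entry point might look like the following; main.py may parse its arguments differently.

    # Hypothetical CLI wiring for -u/--url and -d/--depth (illustration only).
    import argparse

    def parse_args():
        parser = argparse.ArgumentParser(description="Recursive Web Crawler")
        parser.add_argument("-u", "--url", required=True, help="start URL to crawl")
        parser.add_argument("-d", "--depth", type=int, default=2,
                            help="maximum recursion depth")
        return parser.parse_args()

    if __name__ == "__main__":
        args = parse_args()
        # crawl() refers to the sketch shown in the Features section.
        visited, subdomains, js_files = crawl(args.url, args.depth)
        print(f"Crawled {len(visited)} pages; found {len(subdomains)} other hosts "
              f"and {len(js_files)} JavaScript files.")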

Example

python main.py -u "'https://tryhackme.com -d 2
Contributing

If you'd like to contribute to this project, please open an issue or create a pull request.
