aweb3r / bash-crawler

:computer: Get a site's links with bash

Requirements

  1. Get all the links from a website
  2. For each link, get all the links on that page
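Step 1 above (extracting links from a page) can be sketched in plain bash. The helper name and regex are illustrative, not the repo's actual implementation; a simple quoted-`href` pattern like this misses unquoted attributes:

```shell
#!/usr/bin/env bash
# Hypothetical helper: pull href targets out of HTML read from stdin.
# A real crawler would pipe `curl -fsSL "$url"` into it.
extract_links() {
  grep -oE 'href="[^"]+"' | sed -E 's/href="([^"]+)"/\1/'
}

# prints: https://a.com and /b, one per line
printf '%s' '<a href="https://a.com">A</a> <a href="/b">B</a>' | extract_links
```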

Usage

$ crawler <site>

  Usage
    $ crawler [options] <site>

  Options
    --depth,    -d  Depth of the links searched   5
    --help,     -h  Prints this help              false
    --list,     -l  Print a list formatted output false
    --verbose,  -v  Verbose output                false

  Examples
    $ crawler -d 5 www.github.com
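The `--depth` option bounds the recursion from step 2: each discovered link is fetched in turn until the depth budget runs out. A minimal offline sketch (the function names and the stubbed `fetch` are assumptions for illustration, not the repo's code; a real run would fetch with curl):

```shell
#!/usr/bin/env bash
# Static fake "web" so the sketch runs offline.
declare -A PAGES=(
  [site]='<a href="p1">1</a><a href="p2">2</a>'
  [p1]='<a href="p3">3</a>'
)
fetch() { printf '%s' "${PAGES[$1]}"; }
extract_links() { grep -oE 'href="[^"]+"' | sed -E 's/href="([^"]+)"/\1/'; }

# crawl <url> <depth>: print each link, then recurse with one less depth.
crawl() {
  local url=$1 depth=$2
  (( depth <= 0 )) && return 0
  local link
  for link in $(fetch "$url" | extract_links); do
    echo "$link"
    crawl "$link" $(( depth - 1 ))
  done
}

crawl site 2   # prints p1, p3, p2
```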

Tests

Tests are written with the Bats framework; to run them, type:

$ bats --tap crawler.bats

License

MIT


Languages

Shell 100.0%