R0X4R / Garud

An automation tool that enumerates sub-domains, checks for sub-domain takeover, filters out parameters that are potential XSS, SSTI, and SSRF injection points, and automatically scans for some low-hanging vulnerabilities.


Stuck at the crawling part - Gospider.txt output: unknown flag: --js

sh0tal opened this issue · comments

Hello guys,

I installed Garud on a Kali Linux VM, and when I run the command, I can't get past the crawling part. It gets stuck there, and in gospider.txt (from the scan's output) I found the following:
unknown flag: --js
which I suppose is the reason it gets stuck at the crawling part.

Looking at the Garud code, I found the line that causes the problem:
gospider -S enumeration/liveurls.txt -d 10 -c 20 -t 50 -K 3 --no-redirect --js -a -w --blacklist ".(eot|jpg|jpeg|gif|css|tif|tiff|png|ttf|otf|woff|woff2|ico|svg|txt)" --include-subs -q -o enumeration/temp/gospider 2> /dev/null | anew -q enumeration/parameters/gospider.txt && rm -rf out/ &> /dev/null

Can anyone help?
Is this something to do with a gospider flag?
I couldn't find any --js flag in gospider's flag list.

Hello @sh0tal,
Thanks for raising the issue.

--js is a valid argument in gospider; you can check it with the gospider -h command:

--js                        Enable linkfinder in javascript file (default true)

Maybe the issue is on your side or in gospider itself. You can kill all the gospider processes running on your system so Garud will skip to the next step.

To kill all gospider processes:

$ ps aux | grep gospider | sed '/grep/d' | awk '{print $2}' | while read -r line; do kill $line; done
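For reference, the pipeline above deletes the grep line itself (so grep's own process isn't targeted), extracts the PID column from the ps output, and kills each PID. Its extraction logic can be sanity-checked against sample ps-style lines (the PIDs here are made up):

```shell
# Simulated `ps aux` output: the second column is the PID.
# sed drops the `grep gospider` line; awk prints the PID column.
printf 'kali 1234 0.5 gospider -S live.txt\nkali 5678 0.0 grep gospider\n' \
  | sed '/grep/d' \
  | awk '{print $2}'
# prints: 1234
```

On most Linux systems, `pkill gospider` achieves the same thing in a single command.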

Regards,
R0X4R

Hello @R0X4R ,

Thank you for your answer :)

The problem was that I had an outdated version of gospider (v1.1.0), where the --js flag was missing, and I updated it to the newest (v1.1.6).
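In case it helps anyone else hitting this: a quick, generic way to check whether an installed version is older than the one that introduces a flag is to compare version strings with sort -V (the version numbers below are just the ones from this thread):

```shell
# sort -V orders version strings numerically, so if the installed
# version sorts first (and the two differ), it is the older one.
installed="1.1.0"
required="1.1.6"
if [ "$(printf '%s\n%s\n' "$installed" "$required" | sort -V | head -n 1)" = "$installed" ] \
   && [ "$installed" != "$required" ]; then
  echo "gospider is outdated"
fi
# prints: gospider is outdated
```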

Then I ran Garud again, but it got stuck at the crawling part again. I left it running for about 15 hours, scanning hackerone.com (24 subdomains found).

After looking into the result .txt files in the parameters folder, I found that gospider.txt (the file that had the --js flag error) was empty, while the others (gauplus.txt, jslinks.txt, wayback.txt) were fine and contained results.

I suppose the problem still occurs when the gospider command is executed.
I will keep digging into what is going wrong and will let you know if I find anything or come up with a fix.

Hey @sh0tal,
Thanks for your reply.

I have released the latest version of Garud; check it out. I have added a timeout option to the gospider step, which I hope will fix your issue.
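I haven't checked the exact change in the release, but a common way to add a deadline around a crawler step like this is the coreutils timeout wrapper, which kills the wrapped command after a given duration; a minimal demonstration:

```shell
# timeout kills the wrapped command after 1 second; sleep 5 never finishes.
# An exit status of 124 means the command was terminated by timeout.
timeout 1 sleep 5
echo "exit status: $?"
# prints: exit status: 124
```

Applied to the crawl, this guarantees gospider can never hang a scan indefinitely, at the cost of possibly truncating very long crawls.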

Regards,
R0X4R

Hello, @R0X4R

Thank you for the update.
I will give it a try and let you know if any problem arises.

Hey @sh0tal,
Greetings of the day,

It's been a month, so I assume your issue is resolved. Closing this issue.

Thanks.