hakluke / hakrawler

Simple, fast web crawler designed for easy, quick discovery of endpoints and assets within a web application

Home Page: https://hakluke.com

Terminated by signal SIGKILL (Force quit)

vhgbao01 opened this issue · comments

commented

This started happening to me 2 weeks ago; it was running fine before that. I'm currently running hakrawler in tmux on Kali on an AWS EC2 instance.
Command: cat subdomains_probed.txt | hakrawler -u -subs -insecure -d 6 -t 1 > hakrawler.txt
Result: fish: Process 12578, 'hakrawler' from job 1, 'cat subdomains_probed.txt…' terminated by signal SIGKILL (Forced quit)
The output is still written to hakrawler.txt, but the last URL is cut off mid-line rather than written in full.

Potentially a lack of memory?
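One way to check the memory theory (assuming the kernel's OOM killer sent the SIGKILL, which is the usual cause of an unexpected SIGKILL on Linux) is to search the kernel log for OOM-killer activity:

```shell
# If the OOM killer terminated hakrawler, the kernel logs the victim, e.g.:
#   Out of memory: Killed process 12578 (hakrawler) ...
# -T prints human-readable timestamps; reading the ring buffer may require root.
dmesg -T 2>/dev/null | grep -iE 'out of memory|killed process' || echo "no OOM events found"
```

On systemd-based systems, `journalctl -k` gives the same kernel log if `dmesg` output has already rotated away.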

commented

Hi, I can confirm that it worked fine on my PC. I was wondering why it would run out of memory even though I used only 1 thread.

Not sure. How many lines were in subdomains_probed.txt?

Given that it's a SIGKILL, it may have just been a Ctrl+C on the terminal?

commented

About 310 lines; the list was generated by httprobe.
I don't think I used Ctrl+C to kill the process. I simply detached the tmux session with Ctrl+B D after running the command and left it there for a few days. My other jobs are running fine; only this one hit an error.

If you were using the unique flag (`-u`) and running the program for multiple days, you probably filled up your memory with the `sm` sync.Map.

commented

Interesting. I guess piping the output through sort -u as mentioned in #138 is the solution. Thanks for the information, guys.
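The workaround referenced in #138 can be sketched like this: drop hakrawler's `-u` flag (so the process no longer has to remember every URL it has ever printed in an in-memory map) and deduplicate afterwards with `sort -u`. The flags below mirror the command from the original report:

```shell
# Same crawl as in the issue, minus hakrawler's in-memory -u dedup;
# sort -u deduplicates the finished output instead.
cat subdomains_probed.txt | hakrawler -subs -insecure -d 6 -t 1 | sort -u > hakrawler.txt
```

Unlike an in-process map that only grows, `sort` spills to temporary files on disk when its input exceeds memory, so a multi-day crawl should not hit the same unbounded-growth problem.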