cldrn / rainmap-lite

Rainmap Lite - Responsive web-based interface that allows users to launch Nmap scans from their mobiles/tablets/web browsers!



Automatic Scan

exploitprotocol opened this issue · comments

First of all, thanks a lot for your awesome work, it really helps.

Instead of using a cron job, it would be better if scans could be triggered from the web server itself.

For example, if I set the cron interval very low, say 5 minutes, duplicate scans happen.

Let me explain that.

  • Suppose I entered a list of IPs and the cron job has started the scanning script (Job A).
  • Now I add one more list of IPs to scan.
  • The first scan is not yet completed, so a different job is created; let's call it Job B.
  • Now when Job A completes, the script fetches any pending scans and, despite Job B being in the running state, starts scanning those IPs again.

This process goes on and on, which is why I stopped using the cron job and fire nmaper-cronjob.py manually instead. But when I am on a tablet, or somewhere without terminal access, I cannot start the scan.
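
To make it concrete, here is a toy illustration of the overlap I mean: two runs of the script can both read the queue before either one marks the job as running. The schema and names here are made up for illustration, not the real rainmap-lite database.

```python
import sqlite3

# Hypothetical schema, for illustration only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE scans (id INTEGER PRIMARY KEY, target TEXT, status TEXT)")
conn.execute("INSERT INTO scans (target, status) VALUES ('10.0.0.0/24', 'waiting')")

def fetch_waiting():
    return conn.execute(
        "SELECT id, target FROM scans WHERE status = 'waiting'").fetchall()

tick_a = fetch_waiting()  # cron tick A reads the queue
tick_b = fetch_waiting()  # tick B fires before A marks anything as running
for job_id, _target in tick_a:
    conn.execute("UPDATE scans SET status = 'running' WHERE id = ?", (job_id,))

print(tick_a, tick_b)  # both ticks picked up the same job -> duplicate scan
```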

Overall, it would be good if Nmap scans could run in the background and be triggered directly from the admin panel instead of through cron. There should also be a history section showing whether a particular IP or host was scanned before.

Hello,

You have brought up a good point. Cron isn't as robust as a proper queue such as RabbitMQ. I considered this when I wrote the application, but having queues also meant installing another service, so another dependency for the project. One of the issues I had when I tried to revive the previous Rainmap project was that it had a lot of dependencies, some of them on obsolete projects. For this reason I tried to keep it simple and went with cron. Although I see all the advantages RabbitMQ could bring to the project, a simple installation with the fewest dependencies was the original goal.

However, right now, the script shouldn't have an issue with duplicates. The way it works:

  1. Cron calls the script, which selects only the scans with status 'waiting', meaning the job hasn't been executed yet. There are three states: waiting, running, and finished.
  2. When a job is selected, its status is changed to 'running'. If the cron script runs again, the job won't be selected, because the script only picks up jobs in the 'waiting' state.
  3. When a job is complete, its status is changed to 'finished' (see the sketch below).
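
Roughly, the flow looks like this. This is a sketch only, not the actual nmaper-cronjob.py; the table, column, and status names are assumptions for illustration:

```python
import sqlite3
import subprocess

def process_queue(db_path):
    conn = sqlite3.connect(db_path)
    # Step 1: select only jobs that haven't been started yet.
    waiting = conn.execute(
        "SELECT id, target FROM scans WHERE status = 'waiting'").fetchall()
    for job_id, target in waiting:
        # Step 2: mark the job as 'running' so a later cron run skips it.
        conn.execute("UPDATE scans SET status = 'running' WHERE id = ?", (job_id,))
        conn.commit()
        subprocess.run(["nmap", "-oX", f"scan-{job_id}.xml", target], check=False)
        # Step 3: mark the job as 'finished' once nmap returns.
        conn.execute("UPDATE scans SET status = 'finished' WHERE id = ?", (job_id,))
        conn.commit()
    conn.close()
```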

If you have spotted a bug in the current system, I would appreciate it if you could send more details so I can look into it. Maybe in future versions we will move away from cron. This project started as a private project that went public, and there are a lot of use cases I didn't consider when I wrote it originally.

P.S. Thanks for the suggestion about adding a scan history. Historical results are nice when the tool is used to scan the same hosts periodically.

Thanks for all your suggestions. Let me know if I can help you solve your issue with duplicate scans. It shouldn't happen.
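
That said, if what you are seeing does turn out to be two overlapping cron runs reading the queue at the same time, one guard that wouldn't add any dependencies is to claim each job atomically, with a single UPDATE that both checks and changes the status. This is just a sketch under the same assumed schema as above, not the project's actual code:

```python
import sqlite3

def try_claim(conn: sqlite3.Connection, job_id: int) -> bool:
    # Only the invocation that actually flips 'waiting' -> 'running'
    # gets rowcount == 1; any overlapping run sees 0 and skips the job.
    cur = conn.execute(
        "UPDATE scans SET status = 'running' WHERE id = ? AND status = 'waiting'",
        (job_id,),
    )
    conn.commit()
    return cur.rowcount == 1
```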

Hi there,

Thanks a lot for your reply, and sorry for my late response.
Yeah, I understand that you wanted it to be a simple application initially, but I thought I would add my suggestions here.

On the duplicate issue, I also went through the code, and I agree those duplicate results should not occur, but I have seen them sometimes.

Next time it happens, I will look into what caused it. Duplicates do not occur every time.

Thanks a lot for your awesome work. Keep it up 👍

Thanks for the feedback. I appreciate it a lot and will keep it in mind for the next versions.