after giving domain name
r0x5r opened this issue · comments
└─# nf -d testphp.vulnweb.com
[NucleiFuzzer ASCII-art banner] v1.0.2
Made by Satya Prakash (0xKayala)
Running ParamSpider on testphp.vulnweb.com
[ParamSpider ASCII-art banner]
- coded with <3 by Devansh Batham
[!] URLS containing these extensions will be excluded from the results : ['.png', '.jpg', '.gif', '.jpeg', '.swf', '.woff', '.svg', '.pdf', '.json', '.css', '.js', '.webp', '.woff', '.woff2', '.eot', '.ttf', '.otf', '.mp4', '.txt']
[+] Total number of retries: 0
[+] Total unique urls found : 216
[+] Output is saved here : output/testphp.vulnweb.com.yaml
[!] Total execution time : 1.5574s
Running Nuclei on collected URLs
flag provided but not defined: -fuzz
Scan is completed - Happy Fuzzing
The nuclei fuzzer is not running automatically; we have to run this command manually every time: nuclei -t fuzzing-templates -list fuzz_endpoints.txt
Try running the command a couple of times, and outside of the root user.
Hi, @0xKayala
Thank you for the great tool.
I have the same problem
All of the required tools work well in my terminal,
and I also have all the required directories in $HOME.
I have used the command below in the NucleiFuzzer script to run the Nuclei tool on the URLs collected by ParamSpider.
sort "output/$domain.yaml" | uniq | tee "output/$domain.yaml" | httpx -silent -mc 200,301,302,403 | nuclei -t "$home_dir/fuzzing-templates" -fuzz -rl 05
So sometimes the collected URLs are not captured or passed to nuclei, due to some issue with the Nuclei tool. This is a known issue, as it happens even if I modify the script as shown below.
nuclei -l "output/$domain.yaml" -t "$home_dir/fuzzing-templates" -fuzz -rl 05
I am just waiting for an update of the Nuclei tool from the @projectdiscovery team to fix this issue with capturing or passing the URLs to Nuclei.
However, if you run the current script 3 to 4 times, it will work.
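The intermittent failures are consistent with a shell pipeline hazard rather than a Nuclei bug: in sort "output/$domain.yaml" | uniq | tee "output/$domain.yaml", tee truncates its output file as soon as the pipeline starts, so when the same file is also sort's input, the URL list can be emptied before sort reads it. A minimal sketch with a throwaway file (demo.txt is illustrative, not the actual output path) shows the safe staging pattern:

```shell
#!/bin/sh
# Demonstration with a throwaway file, not the real output/$domain.yaml.
printf 'b\na\nb\n' > demo.txt

# Unsafe pattern (commented out): tee truncates demo.txt at pipeline start,
# so sort may read an already-empty file and the list is lost:
#   sort demo.txt | uniq | tee demo.txt

# Safe pattern: stage the result in a separate file, then replace the original.
sort demo.txt | uniq > demo.sorted
mv demo.sorted demo.txt
cat demo.txt   # prints the deduplicated, sorted list
rm demo.txt
```

Because the truncation is a race between the pipeline stages, the unsafe form can appear to work on some runs, which matches the "run it 3 to 4 times" behavior described above.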
Bro, take just one screenshot; this second screenshot looks different. Something seems off.
@r0x5r Bro, the whole thing does not fit in one screenshot.
That is why I captured as much as was visible.
For your satisfaction here is the single screenshot below
Hi, @0xKayala
I have fixed the issue just by changing the file extension in the tee command in the NucleiFuzzer.sh file:
from: tee "output/$domain.yaml"
to: tee "output/$domain.txt"
The tool works very well now.
Hope This Helps!
Previously I used a TXT file, but I still faced the same issue. That is why I changed it to YAML, since saving was faster with the YAML format.
I haven't faced any problems since I edited it. However, the tool is a little slow to initialize the scan.
sort "output/$domain.yaml" - I think you forgot to change the extension here
The issue with the script lies in how the output files are handled and passed to the nuclei command. Specifically, the output file generated by ParamSpider (output/$domain.yaml) is being overwritten by the tee command before it can be passed to httpx and then to nuclei. This is why nuclei is not receiving the URLs for scanning.
To fix this, I have modified the script to use a temporary file to store the sorted and unique URLs before passing them to httpx and nuclei. Here's the updated portion of the script:
# Step 5: Run the Nuclei Fuzzing templates on the collected URLs
echo "Running Nuclei on collected URLs"
if [ -n "$domain" ]; then
# Use a temporary file to store the sorted and unique URLs
temp_file=$(mktemp)
sort "output/$domain.yaml" | uniq > "$temp_file"
httpx -silent -mc 200,301,302,403 -l "$temp_file" | nuclei -t "$home_dir/fuzzing-templates" -fuzz -rl 05
rm "$temp_file" # Remove the temporary file
elif [ -n "$filename" ]; then
# Create the temporary file in this branch too
temp_file=$(mktemp)
sort "$output_file" | uniq > "$temp_file"
httpx -silent -mc 200,301,302,403 -l "$temp_file" | nuclei -t "$home_dir/fuzzing-templates" -fuzz -rl 05
rm "$temp_file" # Remove the temporary file
fi
This modification creates a temporary file ($temp_file) to store the sorted and unique URLs, which are then passed to httpx for scanning. After the scanning is complete, the temporary file is removed. This should ensure that the collected URLs are properly passed to nuclei for scanning.
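As a further tweak (a sketch under assumptions, not part of the posted script), sort | uniq can be collapsed into sort -u, and a trap can remove the temporary file even if a pipeline step fails midway. Here urls.yaml is a stand-in for output/$domain.yaml, and cat stands in for the real httpx | nuclei pipeline so the pattern can be run on its own:

```shell
#!/bin/sh
# Sketch: dedupe URLs into a temp file and guarantee cleanup with trap.
# "urls.yaml" is a stand-in for output/$domain.yaml; cat stands in for
# the real `httpx ... | nuclei ...` pipeline.
printf 'http://b.example/?q=1\nhttp://a.example/?q=1\nhttp://b.example/?q=1\n' > urls.yaml

temp_file=$(mktemp)
trap 'rm -f "$temp_file" urls.yaml' EXIT   # cleanup runs even on early failure

sort -u urls.yaml > "$temp_file"           # sort -u is equivalent to sort | uniq
cat "$temp_file"                           # real script: httpx -l "$temp_file" | nuclei ...
```

The trap means the rm "$temp_file" lines in each branch become unnecessary, and a failing httpx or nuclei run no longer leaves stray files in /tmp.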