Tib3rius / AutoRecon

AutoRecon is a multi-threaded network reconnaissance tool which performs automated enumeration of services.


Curl robots.txt not working

45im opened this issue

Curl robots.txt always reports "There did not appear to be a robots.txt file in the webroot" despite there being a non-empty robots.txt in the webroot.
Is there something amiss here?

async def run(self, service):
	if service.protocol == 'tcp':
		process, stdout, _ = await service.execute('curl -sSikf {http_scheme}://{addressv6}:{port}/robots.txt', future_outfile='{protocol}_{port}_{http_scheme}_curl-robots.txt')

		lines = await stdout.readlines()

		if process.returncode == 0 and lines:
			filename = fformat('{scandir}/{protocol}_{port}_{http_scheme}_curl-robots.txt')
			with open(filename, mode='wt', encoding='utf8') as robots:
				robots.write('\n'.join(lines))
		else:
			service.info('{bblue}[' + fformat('{tag}') + ']{rst} There did not appear to be a robots.txt file in the webroot (/).')

I'm unable to reproduce this; however, I could probably move this check over to requests rather than curl, which might make it more reliable.
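For reference, a requests-based version could look roughly like the sketch below. This is only an illustration of the idea, not plugin code; the function name, timeout, and output path handling are all assumptions.

# Hypothetical sketch of a requests-based robots.txt check; the function name,
# timeout, and output path are assumptions, not real plugin values.
import requests

def fetch_robots(scheme, address, port, outfile):
	url = f'{scheme}://{address}:{port}/robots.txt'
	try:
		# verify=False mirrors curl's -k (ignore certificate errors); the timeout is an assumption.
		response = requests.get(url, verify=False, timeout=10)
	except requests.RequestException:
		return False
	if response.status_code == 200 and response.text:
		with open(outfile, mode='wt', encoding='utf8') as robots:
			robots.write(response.text)
		return True
	return False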

Could you run the following manually, though, and provide me with a screenshot showing the result, as well as the exit code (e.g. echo $?)?

curl -sSikf http://<target>/robots.txt

[Screenshots attached showing the manual curl result and exit code.]

Yes, manually it works 100%; the exit code is 0 every time.
I have rerun the same host a few times and it only works about 20% of the time.
The if process.returncode == 0 and lines: check is failing most of the time for some reason.

On localhost it works more reliably (about 80% success) but does sometimes miss it.

And when it does not work, even the tcp_80_http_curl.html file is empty, so the issue seems to be related to the use of curl somehow. Is async/await not playing nicely with service.execute?
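To help narrow it down, here is a small standalone sketch that mimics (as far as I can guess) how service.execute might drive curl through asyncio; the command, target URL, and decoding are placeholders, not AutoRecon internals.

# Standalone sketch approximating an asyncio-driven curl call; the command and
# URL are placeholders, and this only guesses at what service.execute does.
import asyncio

async def main():
	proc = await asyncio.create_subprocess_shell(
		'curl -sSikf http://127.0.0.1/robots.txt',
		stdout=asyncio.subprocess.PIPE,
		stderr=asyncio.subprocess.PIPE)
	stdout, _ = await proc.communicate()  # waits for the process to exit
	lines = stdout.decode('utf8').splitlines()
	# If returncode were read before the process had been waited on, it could
	# still be None, which would make a "== 0 and lines" check fail intermittently.
	print('returncode:', proc.returncode, 'lines:', len(lines))

asyncio.run(main())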

Any movement on this, Tib3rius?

I've run into the same thing with a fresh install from GitHub using pipx. It will run for a bit, identifying ports, then hang on this last line.

└─$ sudo env "PATH=$PATH" autorecon 192.168.240.246
[*] Scanning target 192.168.240.246
[*] [192.168.240.246/all-tcp-ports] Discovered open port tcp/443 on 192.168.240.246
[*] [192.168.240.246/all-tcp-ports] Discovered open port tcp/80 on 192.168.240.246
[*] [192.168.240.246/all-tcp-ports] Discovered open port tcp/2222 on 192.168.240.246
[*] [192.168.240.246/tcp/80/http/vhost-enum] The target was not a hostname, nor was a hostname provided as an option. Skipping virtual host enumeration.
[*] [192.168.240.246/tcp/443/http/vhost-enum] The target was not a hostname, nor was a hostname provided as an option. Skipping virtual host enumeration.
[*] [192.168.240.246/tcp/80/http/known-security] [tcp/80/http/known-security] There did not appear to be a .well-known/security.txt file in the webroot (/).
[*] [192.168.240.246/tcp/80/http/curl-robots] [tcp/80/http/curl-robots] There did not appear to be a robots.txt file in the webroot (/).
[*] [192.168.240.246/tcp/443/http/known-security] [tcp/443/http/known-security] There did not appear to be a .well-known/security.txt file in the webroot (/).
[*] [192.168.240.246/tcp/443/http/curl-robots] [tcp/443/http/curl-robots] There did not appear to be a robots.txt file in the webroot (/).
[*] [192.168.240.246/tcp/443/https/vhost-enum] The target was not a hostname, nor was a hostname provided as an option. Skipping virtual host enumeration.
[*] [192.168.240.246/tcp/443/https/known-security] [tcp/443/https/known-security] There did not appear to be a .well-known/security.txt file in the webroot (/).
[*] [192.168.240.246/tcp/443/https/curl-robots] [tcp/443/https/curl-robots] There did not appear to be a robots.txt file in the webroot (/).

Breaking out of the command will give a TypeError: can only concatenate str (not "list") to str
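For anyone hitting the same traceback: that message is what Python raises when a list is concatenated onto a string instead of being joined first. A minimal illustration of the error class follows; it is not the actual AutoRecon code path, just the same kind of mistake.

# Minimal illustration of the error; not AutoRecon's actual failing line.
lines = ['User-agent: *', 'Disallow: /admin']
try:
	print('robots.txt contents: ' + lines)  # concatenating a list onto a str
except TypeError as error:
	print(error)  # can only concatenate str (not "list") to str
print('robots.txt contents: ' + '\n'.join(lines))  # joining first works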