seomoz / reppy

Modern robots.txt Parser for Python

Fails on untrusted SSL certificates

When running this I get an error on an https website that has an untrusted certificate:
reppy.exceptions.ServerError: [Errno 1] _ssl.c:510: error:14090086:SSL routines:SSL3_GET_SERVER_CERTIFICATE:certificate verify failed

In the requests library, setting verify=False does the trick. Is there an option for this in reppy?

I've not encountered this myself, but my inclination is to use the requests setting. You can also create a RobotsCache with your own requests.Session object (which may have its own option more directly).
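A minimal sketch of that suggestion, assuming requests is installed. Setting verify=False on a Session disables certificate verification for every request made through it; the RobotsCache keyword name shown in the comment is an assumption, not reppy's confirmed signature, so check the constructor before using it:

```python
import requests

# Create a session that skips SSL certificate verification,
# equivalent to passing verify=False to each requests.get call.
session = requests.Session()
session.verify = False

# Hypothetical wiring into reppy -- the parameter name is an
# assumption; consult reppy's RobotsCache source for the exact
# signature before relying on this:
# from reppy.cache import RobotsCache
# cache = RobotsCache(session=session)
```

Note that disabling verification suppresses protection against man-in-the-middle attacks, so it is best confined to crawling hosts you already trust.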

OK, I got it working.

Is there an option to store the cache on the filesystem rather than in an in-memory dictionary?

Not currently, though I have no philosophical objection to having that option and would welcome a PR :-)
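In the meantime, the standard-library shelve module is one lightweight way to approximate this outside of reppy: it exposes a dict-like object persisted to disk, which is the same mapping shape a filesystem-backed cache of fetched robots.txt bodies could take. The file path and cache keys below are illustrative only:

```python
import os
import shelve
import tempfile

# shelve gives a dict-like object persisted to disk; a filesystem-backed
# robots.txt cache could expose the same mapping interface.
path = os.path.join(tempfile.mkdtemp(), "robots-cache")

with shelve.open(path) as cache:
    # Store a fetched robots.txt body keyed by its URL.
    cache["http://example.com/robots.txt"] = "User-agent: *\nDisallow:"

# Reopening the shelf in a new context sees the persisted entry.
with shelve.open(path) as cache:
    print(cache["http://example.com/robots.txt"])
```

Entries survive process restarts, which is the main win over a plain dictionary; a real PR would also need expiry handling so stale robots.txt responses get refetched.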

Thanks for the quick replies! I'll definitely look into adding that when I get some time.