DerHamm / proxy_requests

A class that uses scraped proxies to make HTTP GET/POST requests (via the Python requests library)


Python Proxy Requests | make an HTTP GET/POST with a proxy scraped from https://www.sslproxies.org/

pypi.org: https://pypi.org/project/proxy-requests/

The ProxyRequests class first scrapes proxies from the web. If a request through the current proxy fails, it recursively retries with the next proxy until one succeeds.
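Conceptually, the scraping step just pulls ip:port pairs out of the sslproxies.org proxy table. A rough sketch of that idea (the function names and the regex are illustrative assumptions, not the library's actual internals):

```python
import re
import requests

def parse_proxies(html):
    # Rows on sslproxies.org look roughly like <td>203.0.113.7</td><td>8080</td>;
    # extract each ip:port pair (the pattern is an assumption about the markup).
    pattern = r"<td>(\d{1,3}(?:\.\d{1,3}){3})</td><td>(\d{2,5})</td>"
    return ["{}:{}".format(ip, port) for ip, port in re.findall(pattern, html)]

def scrape_proxies():
    # Fetch the free proxy list page and parse out the proxy sockets.
    html = requests.get("https://www.sslproxies.org/", timeout=10).text
    return parse_proxies(html)
```

The parsed sockets can then be fed to requests' `proxies` argument one at a time.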

Either copy the code into your project, or install via pip:

pip install proxy-requests (or pip3)

then import the class:

from proxy_requests import ProxyRequests

or if you need the Basic Auth subclass as well:
from proxy_requests import ProxyRequests, ProxyRequestsBasicAuth

If the above import statement is used, method calls will be identical to the ones shown below. Pass a fully qualified URL when initializing an instance.

System Requirements: Python 3 and the requests module.

Runs on Linux and Windows (and likely macOS). A request may take a moment depending on the current proxy.
Each proxied request is made with an 8-second timeout, so a request that takes too long is abandoned before trying the next proxy socket in the queue.
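The retry behaviour described above amounts to walking the queue of scraped proxies until one answers. A minimal sketch of that loop (`get_with_proxies` is a hypothetical helper for illustration, not the library's API):

```python
import requests

def get_with_proxies(url, proxies, timeout=8):
    """Try each scraped proxy in turn until one succeeds.

    Illustrative sketch only; the library's real internals may differ.
    """
    for proxy in proxies:
        try:
            # 8-second timeout so a dead or slow proxy doesn't hang the call
            return requests.get(
                url,
                proxies={"http": proxy, "https": proxy},
                timeout=timeout,
            )
        except (requests.ConnectionError, requests.Timeout):
            continue  # this proxy failed; fall through to the next in the queue
    raise RuntimeError("no working proxy found")
```

The library wraps this pattern up behind the `get()`/`post()` methods shown below.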

The ProxyRequestsBasicAuth subclass has the methods get(), get_with_headers(), post(), post_with_headers(), post_file(), and post_file_with_headers(), which override the parent methods.

GET:

    
r = ProxyRequests("https://api.ipify.org")
r.get()
    

GET with headers:

    
h = {"User-Agent": "NCSA Mosaic/3.0 (Windows 95)"}
r = ProxyRequests("url here")
r.set_headers(h)
r.get_with_headers()
    

POST:

    
r = ProxyRequests("url here")
r.post({"key1": "value1", "key2": "value2"})
    

POST with headers:

    
r = ProxyRequests("url here")
r.set_headers({"name": "rootVIII", "secret_message": "7Yufs9KIfj33d"})
r.post_with_headers({"key1": "value1", "key2": "value2"})
    

POST FILE:

    
r = ProxyRequests("url here")
r.set_file({'file': open('test.txt', 'rb')})
r.post_file()
    

POST FILE with headers:

    
h = {"User-Agent": "NCSA Mosaic/3.0 (Windows 95)"}
r = ProxyRequests("url here")
r.set_headers(h)
r.set_file({'file': open('test.txt', 'rb')})
r.post_file_with_headers()
    

GET with Basic Authentication:

    
r = ProxyRequestsBasicAuth("url here", "username", "password")
r.get()
    

GET with headers & Basic Authentication:

    
h = {"User-Agent": "NCSA Mosaic/3.0 (Windows 95)"}
r = ProxyRequestsBasicAuth("url here", "username", "password")
r.set_headers(h)
r.get_with_headers()
    

POST with Basic Authentication:

    
r = ProxyRequestsBasicAuth("url here", "username", "password")
r.post({"key1": "value1", "key2": "value2"})
    

POST with headers & Basic Authentication:

    
r = ProxyRequestsBasicAuth("url here", "username", "password")
r.set_headers({"header_key": "header_value"})
r.post_with_headers({"key1": "value1", "key2": "value2"})
    

POST FILE with Basic Authentication:

    
r = ProxyRequestsBasicAuth("url here", "username", "password")
r.set_file({'file': open('test.txt', 'rb')})
r.post_file()
    

POST FILE with headers & Basic Authentication:

    
h = {"User-Agent": "NCSA Mosaic/3.0 (Windows 95)"}
r = ProxyRequestsBasicAuth("url here", "username", "password")
r.set_headers(h)
r.set_file({'file': open('test.txt', 'rb')})
r.post_file_with_headers()
    



Response Methods

Get the response as a string:
print(r)
Or the raw content as bytes:
r.get_raw()
Get the response as JSON (if valid JSON):
r.get_json()
Get the response headers:
print(r.get_headers())
Get the status code:
print(r.get_status_code())
Get the proxy that was used to make the request:
print(r.get_proxy_used())

To write raw data to a file (including an image):

    

url = 'https://www.restwords.com/static/ICON.png'
r = ProxyRequests(url)
r.get()
with open('out.png', 'wb') as f:
    f.write(r.get_raw())

    

Dump the response to a file as JSON:
    
import json
with open('test.txt', 'w') as file_out:
    json.dump(r.get_json(), file_out)
    


This was developed on Ubuntu 16.04.4 LTS.
Author: James Loye Colley 04AUG2018



License: MIT License