Daniel Marcelino (dmarcelinobr)



Company: https://jota.info

Location: Brazil

Home Page: https://www.linkedin.com/in/dmarcelinobr

Twitter: @dmarcelinobr


Daniel Marcelino's repositories

SciencesPo

A tool set for analyzing political science data

SoundexBR

Soundex (Phonetic) Algorithm for Portuguese Spelling Strings

Language: R · Stargazers: 13 · Issues: 1 · Issues: 0
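SoundexBR's Portuguese-specific code table aside, the core idea can be sketched in a few lines. Below is a minimal Python illustration of classic American Soundex, the algorithm the package adapts; it is not SoundexBR's implementation, and it omits full Soundex's special H/W adjacency rule:

```python
# Classic American Soundex, to illustrate the idea SoundexBR adapts;
# the R package itself uses rules tuned to Portuguese spelling.
CODES = {}
for digit, letters in {"1": "BFPV", "2": "CGJKQSXZ", "3": "DT",
                       "4": "L", "5": "MN", "6": "R"}.items():
    for letter in letters:
        CODES[letter] = digit

def soundex(name):
    """Return the 4-character Soundex code, e.g. 'Robert' -> 'R163'."""
    name = name.upper()
    digits = [CODES.get(ch, "") for ch in name]
    out = []
    prev = digits[0]  # the first letter's own code is not emitted
    for d in digits[1:]:
        if d and d != prev:  # skip vowels, collapse adjacent duplicates
            out.append(d)
        prev = d
    return (name[0] + "".join(out) + "000")[:4]
```

Phonetically similar spellings collapse to the same code: "Robert" and "Rupert" both map to R163, which is what makes the encoding useful for matching noisy name data.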

Jurimetrics

Can we ever hope to predict Supreme Court decisions before they happen?

r-code-examples

A bunch of r code

Language: R · Stargazers: 5 · Issues: 1 · Issues: 0

Legismetrics

Can we ever hope to predict legislative lawmaking activity before it happens?

rDOU

An R package to read and analyze the Diário Oficial da União (DOU), Brazil's federal official gazette

Stargazers: 2 · Issues: 0 · Issues: 0

SenadoBR

A local R package for accessing the Brazilian Senate API

Language: HTML · Stargazers: 2 · Issues: 1 · Issues: 0
Language: CSS · Stargazers: 1 · Issues: 0 · Issues: 0

FPTP2AV

Simulate Election Outcomes for AV Given FPTP Votes.

Language: R · Stargazers: 1 · Issues: 0 · Issues: 0
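The package simulates Alternative Vote (AV) outcomes from first-past-the-post (FPTP) vote data in R. As a hedged sketch of the counting rule the simulation targets, here is a minimal instant-runoff count in Python; the ballots and candidate names are invented for the demo, not drawn from the package:

```python
from collections import Counter

def av_winner(ballots):
    """AV (instant-runoff) count: eliminate the last-placed candidate and
    transfer each ballot to its next surviving preference until one
    candidate holds a majority of the continuing votes."""
    candidates = {c for ballot in ballots for c in ballot}
    while True:
        tally = Counter()
        for ballot in ballots:
            for choice in ballot:
                if choice in candidates:
                    tally[choice] += 1
                    break
        leader, votes = tally.most_common(1)[0]
        if votes * 2 > sum(tally.values()) or len(candidates) == 1:
            return leader
        # ties for last place are broken arbitrarily in this sketch
        candidates.discard(min(candidates, key=lambda c: tally[c]))

# Hypothetical ballots: FPTP would elect A on first preferences (4 of 9),
# but after C's elimination the transfers give B a majority under AV.
ballots = [("A", "B")] * 4 + [("B", "A")] * 3 + [("C", "B")] * 2
```

The example shows why the conversion matters: the same electorate can produce different winners under the two rules.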

ggdecor

Improve your data visualisation with ggplot2 in 1 minute

Language: R · Stargazers: 1 · Issues: 1 · Issues: 0

Pandemias

A small R program to crawl and analyze COVID-19 outbreak data worldwide and for each country

Language: HTML · Stargazers: 1 · Issues: 0 · Issues: 0

rCEP

Repo for the Brazilian Postal Codes (CEP)

Stargazers: 1 · Issues: 0 · Issues: 0
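A CEP (Código de Endereçamento Postal) is five digits, an optional hyphen, and three digits, e.g. 01310-100. A minimal format check in Python, offered only as an illustration of the data the repo holds, not the repo's own validation logic:

```python
import re

# CEP format: five digits, optional hyphen, three digits (e.g. "01310-100").
# This checks the format only; it does not verify that the code exists.
CEP_RE = re.compile(r"^\d{5}-?\d{3}$")

def normalize_cep(cep):
    """Return the canonical 'NNNNN-NNN' form, or None if malformed."""
    cep = cep.strip()
    if not CEP_RE.match(cep):
        return None
    digits = cep.replace("-", "")
    return digits[:5] + "-" + digits[5:]
```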

rTSE

An R package to download and clean datasets from the TSE (Tribunal Superior Eleitoral, Brazil's electoral court)

Language: R · Stargazers: 1 · Issues: 0 · Issues: 0

tidytuesday

Official repo for the #tidytuesday project

License: MIT · Stargazers: 1 · Issues: 0 · Issues: 0

2019-Candidatos-Laranjas

Files for replication of "Extreme non-viable candidates and quota maneuvering in Brazilian legislative elections"

Language: R · Stargazers: 0 · Issues: 0 · Issues: 0

causality

In which I play with the ideas surrounding causality

License: MIT · Stargazers: 0 · Issues: 0 · Issues: 0

data-science-book

A practitioner's book on data science

Stargazers: 0 · Issues: 1 · Issues: 0

geobr

Easy access to official spatial data sets of Brazil in R and Python

Language: R · Stargazers: 0 · Issues: 0 · Issues: 0

partyfactsdata

Party Facts data import

Language: R · Stargazers: 0 · Issues: 0 · Issues: 0

rBayesianOptimization

Bayesian Optimization of Hyperparameters

Stargazers: 0 · Issues: 0 · Issues: 0
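The core loop of Bayesian optimization is: fit a surrogate model to the points evaluated so far, maximize an acquisition function that trades exploitation against exploration, and evaluate the expensive objective at that point. The Python caricature below substitutes a kernel-weighted mean plus a distance-based exploration bonus for the real Gaussian-process posterior the package uses, and the objective is an invented stand-in for a cross-validation score:

```python
import math

def toy_objective(x):
    # invented stand-in for an expensive cross-validation score
    return -(x - 2.0) ** 2

def surrogate(x, xs, ys, h=0.5):
    """Kernel-weighted mean of past observations: a crude stand-in
    for the Gaussian-process posterior mean a real BO library fits."""
    weights = [math.exp(-((x - xi) / h) ** 2) for xi in xs]
    return sum(w * y for w, y in zip(weights, ys)) / sum(weights)

def acquisition(x, xs, ys, kappa=1.0):
    # predicted value plus an exploration bonus for unexplored regions
    return surrogate(x, xs, ys) + kappa * min(abs(x - xi) for xi in xs)

grid = [i / 100 for i in range(401)]   # hyperparameter range [0, 4]
xs = [0.0, 4.0]                        # initial design points
ys = [toy_objective(x) for x in xs]
for _ in range(10):                    # fit -> acquire -> evaluate loop
    x_next = max(grid, key=lambda x: acquisition(x, xs, ys))
    xs.append(x_next)
    ys.append(toy_objective(x_next))
best = xs[ys.index(max(ys))]
```

Even this crude surrogate finds the optimum at x = 2 in a handful of evaluations, which is the whole appeal when each evaluation is a full model-training run.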

scan

A Python 3 script by PosiX that probes a website against a wordlist, looking for shell backdoors:

#!/usr/bin/python3
# Code by PosiX
from urllib.request import Request, urlopen
from urllib.error import URLError, HTTPError
import argparse
import sys
import time


class ZeroScann:

    def __init__(self):
        self.scan()

    def scan(self):
        # argument parsing
        parser = argparse.ArgumentParser(
            prog="PosiX.py", description="Simple Find Shell in Website")
        parser.add_argument("-u", dest="domain", help="your url")
        parser.add_argument("-w", dest="wordlist", help="your wordlist")
        args = parser.parse_args()
        if not args.domain or not args.wordlist:
            sys.exit("\033[36musage: shell.py -u example.com -w wordlist.txt")

        # normalize the target URL
        site = args.domain
        print("\033[96m[?] \033[0mStart Crawling...")
        print("\033[96m[!] \033[0mWait a sec!", "\n")
        time.sleep(3)
        if not site.startswith("http://"):
            site = "http://" + site
        if not site.endswith("/"):
            site = site + "/"

        # load the wordlist; the with-block closes the file on every path
        try:
            with open(args.wordlist, "r") as wlist:
                wordlist = wlist.readlines()
        except FileNotFoundError:
            sys.exit("\033[91mUpss, Wordlist Not Found!\033[0m")

        user_agent = ("Mozilla/5.0 (Macintosh; Intel Mac OS X 10_9_3) "
                      "AppleWebKit/537.36 (KHTML, like Gecko) "
                      "Chrome/35.0.1916.47 Safari/537.36")
        found = []  # list to hold the results we find
        resp_codes = {403: "403 forbidden", 401: "401 unauthorized"}

        # loop over the wordlist, requesting each candidate path
        starttime = time.time()
        for psx in wordlist:
            try:
                psx = psx.replace("\n", "")
                url = site + psx
                req = Request(url, headers={"User-Agent": user_agent})
                time.sleep(0.1)
                stamp = "\033[96m[\033[90m{0}\033[96m]".format(
                    time.strftime("%H:%M:%S"))
                try:
                    urlopen(req)
                    print(stamp, "\033[92mfound:", "\033[0m/" + psx)
                    found.append(url)
                except HTTPError as e:
                    if e.code == 404:
                        print(stamp, "\033[91merror:", "\033[0m/" + psx)
                    else:
                        # .get() keeps unlisted status codes from raising
                        print(stamp, "\033[92minfo :", "\033[33m/" + psx,
                              "\033[92mstatus:\033[33m",
                              resp_codes.get(e.code, str(e.code)))
            except URLError:
                sys.exit("\033[31m[!] Upss, No Internet Connection")
            except KeyboardInterrupt:
                print("\n\033[96m[?] \033[0mCTRL+C Detected")
                print("\033[96m[!] \033[0mExit Program")
                time.sleep(2)
                sys.exit()
            except Exception:
                print("\n\033[93m[?] \033[0mYour Connection Is Bad")
                print("\033[93m[!] \033[0mExit Program")
                time.sleep(3)
                sys.exit()

        # report results and elapsed time
        if found:
            print("\n\033[96m[+] \033[0mResult Found\033[92m")
            print("\n".join(found))
            print("\033[96m[?] \033[0mTime Elapsed: \033[35m%.2f\033[0m Seconds"
                  % (time.time() - starttime))
        else:
            print("\n\033[96m[!] \033[0mCould Not Find Any Shell Backdoor")
            print("\033[96m[?] \033[0mTime Elapsed: \033[33m%.2f\033[0m Seconds"
                  % (time.time() - starttime))


def banner():
    # credits box; the original's ASCII-art face did not survive extraction
    return """\033[33m
    # ================================== #
    #           Shell Scanner            #
    #           Code by PosiX            #\033[0m
    # Twitter: @posiX                    #
    # http://maqlo-heker.blogspot.com    #
    # ================================== #
    """


if __name__ == '__main__':
    print(banner())
    ZeroScann()

Stargazers: 0 · Issues: 0 · Issues: 0

shap

A game theoretic approach to explain the output of any machine learning model.

License: MIT · Stargazers: 0 · Issues: 0 · Issues: 0
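The "game theoretic approach" is the Shapley value: a feature's attribution is its marginal contribution averaged over every order in which features can be revealed. For a tiny model this can be computed exactly by enumeration; the 3-feature model below is invented for the demo, and the real shap library uses far more efficient approximations (KernelSHAP, TreeSHAP) rather than this brute force:

```python
from itertools import permutations

def toy_model(x, baseline, mask):
    """Invented 3-feature model; masked-out features revert to baseline."""
    z = [xi if m else bi for xi, bi, m in zip(x, baseline, mask)]
    return 2 * z[0] + 3 * z[1] + z[0] * z[2]

def shapley_values(x, baseline, f, n=3):
    """Exact Shapley values: average each feature's marginal contribution
    over every order in which the features can be revealed."""
    phi = [0.0] * n
    orders = list(permutations(range(n)))
    for order in orders:
        mask = [False] * n
        prev = f(x, baseline, mask)
        for i in order:
            mask[i] = True
            cur = f(x, baseline, mask)
            phi[i] += cur - prev
            prev = cur
    return [p / len(orders) for p in phi]
```

The attributions satisfy the efficiency property: they sum exactly to f(x) minus f(baseline), so the explanation fully accounts for the prediction, with interaction terms split evenly between the features involved.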