WJC_Jr's repositories
BandLab-Bot
A BandLab autoplay bot written in Python 3.12 with Selenium. It logs into a BandLab account, scrolls through the feed, and comments on or likes the first post it comes across.
Instagram-Bot
Automated Instagram tool built with Python 3 and Selenium that exercises specific Instagram features by locating HTML elements and interacting with them.
-Python-Rumble-Bot-
Python script that automates browser interactions using Selenium WebDriver. It spins up multiple browser instances, each attached to a separate node of a binary tree.
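The tree-of-instances layout can be sketched with an array-backed binary tree, where node i's children sit at 2i+1 and 2i+2. The real repo attaches Selenium WebDriver instances to the nodes; in this sketch each slot is a placeholder label so the structure is visible without launching a browser:

```python
def child_indices(i):
    """Children of node i in an array-backed binary tree."""
    return 2 * i + 1, 2 * i + 2

def build_instance_tree(n):
    """Assign n stubbed 'browser instances' to binary-tree nodes.

    The actual script would store WebDriver objects here; labels stand in
    so the tree layout can be shown without a running browser.
    """
    return [f"browser-{i}" for i in range(n)]

tree = build_instance_tree(7)
left, right = child_indices(0)
print(tree[left], tree[right])  # the root's two child instances
```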
Audio-Downloader-Website
Python and Django website that allows users to download audio files. Users can easily navigate the platform and download audio samples, including drum patterns, bass loops, and full beats, as MP3 files.
Authentication-Bot
Automated authentication bot in Java 17 using Selenium with the Firefox GeckoDriver. It creates a new browser instance and locates each element by XPath.
Binance-Web-Scraper
Binance spider built with Scrapy. Its ETL process extracts crypto price data from Binance's price page, organizes it with Scrapy items, and stores it in JSON format for analysis and integration into a PostgreSQL database.
Coffee-Shop-Website
Coffee Shop website in Python & Django
Discord-Bot
Dictionary bot ("Define"): a Discord chatbot that serves definitions from Webster's dictionary in public and private Discord channels. Written in Python 3 against Discord's API, with the dataset stored in JSON.
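The lookup core of such a bot, with the Discord wiring omitted, might look like the sketch below. The dataset shape (a flat word-to-definition JSON object) and the sample entries are assumptions, not the repo's actual schema:

```python
import json

# Assumed dataset shape: {"word": "definition"} — the real repo's schema may differ.
DICTIONARY_JSON = '{"python": "a large heavy-bodied snake", "selenium": "a nonmetallic element"}'
DEFINITIONS = json.loads(DICTIONARY_JSON)

def define(word):
    """Core of a hypothetical 'define' command: look the word up in the JSON dataset."""
    return DEFINITIONS.get(word.lower(), f"No definition found for '{word}'.")

print(define("Python"))
```

A command handler would call `define()` with the user's query and post the returned string to the channel.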
Generate_Binary_Matrix
Generates a binary matrix in which each row is the binary representation of a decimal number. Also covers binary division.
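One way to build such a matrix is to format each row index as a zero-padded binary string and split it into digits; the function name and padding rule here are illustrative, not taken from the repo:

```python
def binary_matrix(n, width=None):
    """Return an n-row matrix; row i holds the binary digits of i, zero-padded."""
    if width is None:
        width = max(1, (n - 1).bit_length())  # digits needed for the largest row index
    return [[int(bit) for bit in format(i, f"0{width}b")] for i in range(n)]

print(binary_matrix(4))  # → [[0, 0], [0, 1], [1, 0], [1, 1]]
```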
Gesture-Tracking-opencv-python
Adjusts the system volume based on the distance between landmark[4] (thumb tip) and landmark[8] (index fingertip). The script uses the osascript, OpenCV, cvzone, math, and os modules. Once the camera detects a hand, a bounding box surrounds it and the volume tracks the finger distance over time. (The latest push attaches a PyQt6 GUI to the app.)
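The distance-to-volume mapping at the heart of this idea can be sketched without a camera. The pixel ranges (20–200) and the 0–100 volume scale are assumptions for illustration; the real script reads landmarks from cvzone/OpenCV and sets volume via osascript:

```python
import math

def landmark_distance(p1, p2):
    """Euclidean distance between two (x, y) landmark points."""
    return math.hypot(p2[0] - p1[0], p2[1] - p1[1])

def distance_to_volume(dist, d_min=20, d_max=200):
    """Linearly map a pixel distance onto a 0-100 volume scale, clamped at both ends."""
    t = (dist - d_min) / (d_max - d_min)
    return round(min(max(t, 0.0), 1.0) * 100)

thumb_tip, index_tip = (100, 100), (100, 210)  # hypothetical landmark[4] / landmark[8]
print(distance_to_volume(landmark_distance(thumb_tip, index_tip)))  # → 50
```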
Parallelized_File_Converter
The program extracts CSV data from a multiline string, transforms it into JSON using a Pandas DataFrame (the transformation step is parallelized), then loads the resulting JSON into the current directory. The ETL steps are mapped across a pool of threads to achieve parallelism at runtime.
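The map-the-transform-across-threads pattern can be sketched with the standard library alone (the repo uses a Pandas DataFrame; this stripped-down version uses `csv` and `json` so the threading shape stands out, and the sample CSV text is invented):

```python
import csv
import io
import json
from concurrent.futures import ThreadPoolExecutor

CSV_TEXT = """name,bpm
kick_loop,120
bass_line,95
full_beat,140
"""

def row_to_json(row):
    """Transform one parsed CSV row (a dict) into a JSON string."""
    return json.dumps(row)

def convert(csv_text, workers=4):
    # Extract: parse the multiline CSV string into row dicts.
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    # Transform: map the conversion step across a thread pool.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(row_to_json, rows))

print(convert(CSV_TEXT))
```

A load step would then write each JSON string to a file in the current directory.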
Portfolio
With more than three years of experience as a certified Python developer, I have built web applications, data pipelines, and automated tools that tackle common challenges of working with the modern web. I currently test mobile and web prototypes for various software testing platforms.
PyQt6-AudioCompressor-GUI
Fully responsive GUI audio-compressor plugin written in PyQt6. Uses multiple QtWidgets, with the GUI logic written in Python 3.
Real-Estate-WebCrawler
A real-estate web crawler and data pipeline built with Python and Scrapy that runs an ETL process in multiple stages. It extracts metadata for apartments in Milan, Italy, from various web pages and URLs on sublet.com, then structures the data with Scrapy items and saves it in JSON format.
randomPasswordGenerator
Random password generator built from a while loop with nested conditional statements, plus two for loops that iterate over lists of specified characters.
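The described structure — a retry loop with nested conditionals and two passes over character lists — might look like this sketch; the character pools and length are assumptions, not the repo's actual settings:

```python
import random
import string

def random_password(length=12):
    """Keep generating until the password holds at least one letter and one digit."""
    letters = list(string.ascii_letters)
    digits = list(string.digits)
    pool = letters + digits
    while True:  # retry loop
        password = "".join(random.choice(pool) for _ in range(length))
        has_letter = False
        for ch in password:      # first for loop over the generated characters
            if ch in letters:
                has_letter = True
        has_digit = False
        for ch in password:      # second for loop over the generated characters
            if ch in digits:
                has_digit = True
        if has_letter:
            if has_digit:        # nested conditional, as in the description
                return password

print(random_password())
```

For real credentials, the standard library's `secrets` module is the safer choice over `random`.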
To-do-app
Full-stack to-do app written in Django and React. It performs basic CRUD operations against a SQLite database, with Axios handling API requests from the React frontend.
Two_Sum_Algo
Iterates through a list of integers, adds each element to its adjacent neighbor, and checks whether the target variable equals the sum of any of those pairs.
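A minimal sketch of the neighbor-sum check as described (the function name and return convention are illustrative):

```python
def neighbor_two_sum(nums, target):
    """Return the pair of adjacent indices whose values sum to target, or None."""
    for i in range(len(nums) - 1):
        if nums[i] + nums[i + 1] == target:
            return (i, i + 1)
    return None

print(neighbor_two_sum([2, 7, 11, 15], 9))  # → (0, 1), since 2 + 7 == 9
```

Note this only examines adjacent pairs; the classic Two Sum problem considers all pairs, typically via a hash map of seen values.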
URL-Enumeration-Tool
Locates subdomains and subdirectories across up to three permitted websites and writes them to separate CSV files.
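The candidate-generation and per-site CSV output steps can be sketched offline; the wordlist is invented, and the actual tool's reachability checks (HTTP probing) are omitted here:

```python
import csv
import io

WORDLIST = ["www", "mail", "api"]  # illustrative wordlist; the real tool's list may differ

def enumerate_subdomains(domain, wordlist=WORDLIST):
    """Build candidate subdomain URLs for one permitted domain."""
    return [f"https://{word}.{domain}" for word in wordlist]

def to_csv(domain, wordlist=WORDLIST):
    """Render the candidates for one site as its own CSV text block."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["candidate_url"])
    for url in enumerate_subdomains(domain, wordlist):
        writer.writerow([url])
    return buf.getvalue()

print(to_csv("example.com"))
```

Running this once per permitted site, writing each result to its own file, gives the separate-CSV-per-website output the description mentions.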
Wikipedia-WebCrawler
Wikipedia web crawler written in Python with Scrapy. The ETL process extracts specific data from multiple Wikipedia pages and links, organizes it into a structured format using Scrapy items, and saves it as JSON for further analysis and integration into MySQL Workbench.