Tech-with-Vidhya

Company: Software Lead/Consultant, Data Scientist & ML/Data Engineer

Location: Queen Mary University of London, UK

Home Page: https://github.com/Tech-with-Vidhya

Tech-with-Vidhya's repositories

productionized_docker_ML_model_application_into_kubernetes_cluster_using_AWS_EKS_CloudFormation_EMR

This project covers the end-to-end implementation of deploying and productionizing a dockerized/containerized machine learning Python Flask application into a Kubernetes cluster using AWS Elastic Kubernetes Service (EKS), AWS serverless Fargate instances, AWS CloudFormation and the AWS Elastic Container Registry (ECR) service. The machine learning business case implemented in this project is a bank note authentication binary classifier built with a Random Forest Classifier, which predicts and classifies a bank note as either a fake bank note (label 0) or a genuine bank note (label 1).

Implementation steps:

1. Built an end-to-end machine learning solution covering all the ML life-cycle steps of data exploration, feature selection, model training, model validation and model testing on unseen production data.
2. Saved the finalised model as a pickle file.
3. Created a Python Flask API to render the model's inferences to end users (a hedged sketch of such an API follows this list).
4. Verified and tested the Flask API in the localhost set-up.
5. Created a Dockerfile (the steps/instructions to build the docker image) for the Flask-based bank note authentication application embedding the Random Forest classifier model.
6. Created IAM service roles with appropriate policies to access AWS ECR, AWS EKS and AWS CloudFormation.
7. Created a new EC2 Linux server instance in AWS and copied the application project's directories and files onto it using SFTP Linux commands.
8. Installed Docker and the supporting Python libraries on the EC2 Linux instance, as listed in "requirements.txt".
9. Built the Docker image from the Dockerfile and started the application container using docker build and docker run commands.
10. Created a Docker repository in AWS ECR and pushed the application image into it using AWS Command Line Interface (CLI) commands.
11. Created the cloud stack with private and public subnets using AWS CloudFormation, with appropriate IAM roles and policies.
12. Created the Kubernetes cluster using AWS EKS, with appropriate IAM roles and policies, and linked it to the CloudFormation stack.
13. Created the AWS serverless Fargate profile and Fargate instances/nodes.
14. Created and configured the "Deployment.yaml" and "Service.yaml" manifests for use with kubectl.
15. Applied "Deployment.yaml" with the pod-replica configuration to the EKS Fargate nodes using kubectl commands.
16. Applied "Service.yaml" using kubectl commands to expose the application to end users for public access, creating the production endpoint.
17. Verified and tested the inferences of the productionized application against the Fargate endpoint created in the EKS cluster.

Tools & technologies: Python, Flask, AWS, AWS EC2, Linux server, Linux commands, Command Line Interface (CLI), Docker, Docker commands, AWS ECR, AWS IAM, AWS CloudFormation, AWS EKS, Kubernetes, kubectl commands.
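The serving code itself is not shown in this listing, but step 3 amounts to a small Flask inference API wrapped around the pickled Random Forest. A minimal sketch, assuming a model file named classifier.pkl, a /predict route and the four numeric bank note features as inputs (all illustrative assumptions, not taken from the repository):

```python
# Minimal sketch of a Flask inference API for the bank note classifier.
# File name, route and expected feature order are assumptions, not the repo's code.
import pickle

import numpy as np
from flask import Flask, jsonify, request

app = Flask(__name__)

# Hypothetical path to the pickled Random Forest model saved in step 2.
with open("classifier.pkl", "rb") as f:
    model = pickle.load(f)


@app.route("/predict", methods=["POST"])
def predict():
    # Expects JSON like {"features": [variance, skewness, curtosis, entropy]}.
    payload = request.get_json(force=True)
    features = np.array(payload["features"], dtype=float).reshape(1, -1)
    label = int(model.predict(features)[0])  # 0 = fake note, 1 = genuine note
    return jsonify({"label": label})


if __name__ == "__main__":
    # Bind to all interfaces so the containerized app is reachable, e.g. on port 5000.
    app.run(host="0.0.0.0", port=5000)
```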

Language: Jupyter Notebook | Stargazers: 6 | Issues: 1 | Issues: 0

credit-risk-assessment-fintech-framework-using-deep-learning-and-transfer-learning

This project represents a dual credit risk assessment framework that predicts consumers' credit scores and forecasts their credit default risk for financial institutions such as commercial banks and lending firms. The implementation mimics the real-world FICO scoring model, with custom enhancements to include a lender's internal credit risk factors by proposing a new Domain-Tech feature selection approach combined with deep learning and transfer learning techniques. This is my final master's project, delivered as part of the MSc Big Data Science programme at Queen Mary University of London (QMUL), United Kingdom.

Language: Jupyter Notebook | Stargazers: 5 | Issues: 1 | Issues: 0

Bitcoin_Network_Analytics_using_Python_NetworkX_and_Gephi

This group project of 4 members was delivered as part of the module "Digital Media and Social Network" of my Masters in Big Data Science (MSc BDS) programme at Queen Mary University of London (QMUL), London, United Kingdom. The project covers network analysis across 4 different problem statements and use cases using the Python NetworkX package, the Gephi network analysis tool and Microsoft Excel (a minimal NetworkX sketch follows below).

Dataset: Bitcoin trade transactions for the period 2011 to 2016, with the attributes Rater, Ratee, Rating and Timestamp.

Network formation: for every trade transaction between 2 users in the Bitcoin network, a rating is recorded and tracked with the corresponding timestamp, giving a directed network.

Size of the dataset and network: 5,881 users/nodes and 35,592 transactions/edges, with ratings in the range -10 to +10 (where -10 is the lowest rating and +10 the highest). The analysis covers basic network statistics and the 4 use cases and objectives.
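For illustration, a minimal sketch of how such a directed rating network can be built and summarised with NetworkX, assuming a headerless CSV export named bitcoin_trades.csv with the columns rater, ratee, rating, timestamp (the file name and column order are assumptions, not taken from the repository):

```python
# Minimal sketch of building the directed Bitcoin trust network with NetworkX.
# Assumes a headerless CSV with columns: rater, ratee, rating, timestamp.
import csv

import networkx as nx

G = nx.DiGraph()
with open("bitcoin_trades.csv", newline="") as f:
    for rater, ratee, rating, timestamp in csv.reader(f):
        # One directed edge per trade, weighted by the rating in [-10, +10].
        G.add_edge(rater, ratee, rating=int(rating), timestamp=timestamp)

# Basic network statistics of the kind reported in the project.
print("nodes:", G.number_of_nodes())   # expected ~5,881 users
print("edges:", G.number_of_edges())   # expected ~35,592 transactions
print("density:", nx.density(G))
print("avg in-degree:", sum(d for _, d in G.in_degree()) / G.number_of_nodes())
```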

Language: Jupyter Notebook | Stargazers: 4 | Issues: 1 | Issues: 0

productionized_docker_ML_model_application_into_AWS_EC2_Linux

This project covers the end-to-end implementation of deploying and productionizing a dockerized/containerized machine learning Python Flask application into an AWS Elastic Compute Cloud (EC2) instance using the AWS Elastic Container Registry (ECR) service. The machine learning business case implemented in this project is a bank note authentication binary classifier built with a Random Forest Classifier, which predicts and classifies a bank note as either a fake bank note (label 0) or a genuine bank note (label 1).

Implementation steps:

1. Built an end-to-end machine learning solution covering all the ML life-cycle steps of data exploration, feature selection, model training, model validation and model testing on unseen production data.
2. Saved the finalised model as a pickle file (a minimal sketch of steps 1-2 follows this list).
3. Created a Python Flask API to render the model's inferences to end users.
4. Verified and tested the Flask API in the localhost set-up.
5. Created a Dockerfile (the steps/instructions to build the docker image) for the Flask-based bank note authentication application embedding the Random Forest classifier model.
6. Created IAM service roles with appropriate policies to access AWS ECR and AWS EC2.
7. Created a new EC2 Linux server instance in AWS and copied the application project's directories and files onto it using SFTP Linux commands.
8. Installed Docker and the supporting Python libraries on the EC2 Linux instance, as listed in "requirements.txt".
9. Built the Docker image from the Dockerfile and started the application container using docker build and docker run commands.
10. Created a Docker repository in AWS ECR and pushed the application image into it using AWS Command Line Interface (CLI) commands.
11. Deployed the dockerized/containerized Flask ML application onto the EC2 Linux instance, creating the production endpoint.
12. Verified and tested the inferences of the productionized application against the EC2 endpoint.

Tools & technologies: Python, Flask, AWS, AWS EC2, Linux server, Linux commands, Command Line Interface (CLI), Docker, Docker commands, AWS ECR, AWS IAM.
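The training notebook is not included in this listing, but steps 1-2 boil down to fitting a Random Forest on the bank note data and pickling it. A minimal sketch, assuming a CSV file named BankNote_Authentication.csv with a "class" label column (both of which are assumptions):

```python
# Minimal sketch of steps 1-2: train the bank note Random Forest classifier
# and save it as a pickle file. The CSV name and column names are assumptions.
import pickle

import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

df = pd.read_csv("BankNote_Authentication.csv")   # hypothetical dataset file
X = df.drop(columns=["class"])                    # numeric note features
y = df["class"]                                   # 0 = fake note, 1 = genuine note

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = RandomForestClassifier(n_estimators=100, random_state=42)
model.fit(X_train, y_train)
print("test accuracy:", accuracy_score(y_test, model.predict(X_test)))

# Persist the finalised model for the Flask API to load.
with open("classifier.pkl", "wb") as f:
    pickle.dump(model, f)
```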

Language: Jupyter Notebook | Stargazers: 4 | Issues: 1 | Issues: 0

MLOps_AWS_LoadBalancing_Docker_Flask_Terraform_Banking_Customers_Churn_Prediction_Ensemble_Technique

This is an AWS machine learning engineering (MLE) and MLOps project for bank customer churn prediction using an ensemble technique, covering Docker, Flask, Terraform and AWS load balancing.

Language: Jupyter Notebook | Stargazers: 3 | Issues: 1 | Issues: 0

dockerizing_credit_risk_assessment_python_flask_web_app_ML_models_deployment_AWS_ECR_ECS_Fargate_EC2

This project covers the implementation of dockerizing a Python Flask based credit risk assessment calculator web application, integrated with two different deep learning and transfer learning based ML models, using the AWS Elastic Container Registry (ECR) and AWS Elastic Container Service (ECS), and deploying it into both an AWS Fargate cluster and an EC2 instance/cluster. The two deployed models produce an individual's 3-digit credit score and the individual's percentage probability of default.

Implementation steps:

1. Created a Dockerfile for the Python Flask based credit risk assessment web application with its 2 deep learning models.
2. Created a new EC2 Ubuntu server instance in AWS and copied the web application project's directories and files onto the Ubuntu server using SFTP Linux commands.
3. Built the Docker image from the Dockerfile.
4. Created a Docker repository in AWS using the AWS ECR service.
5. Authenticated the Docker login credentials with AWS using the AWS Command Line Interface (CLI).
6. Pushed the web application's Docker image into AWS ECR.
7. Created the Docker container in AWS using the AWS ECS service.
8. Created the task definition in AWS ECS linked to the Docker container (see the boto3 sketch after this list).
9. Configured the ECS service definition, specifying the number of task-definition replicas to run, and enabled the load balancer feature to manage the web application's incoming request traffic into the AWS cluster.
10. Configured the AWS Fargate cluster to execute the service and tasks, and deployed the dockerized web application into the Fargate cluster.
11. Alternatively, created and configured an AWS EC2 instance/cluster, created an Identity and Access Management (IAM) user with the required role and policies, executed the ECS tasks and deployed the dockerized web application onto the EC2 instance.

Tools & technologies: Python, Flask, HTML, AWS, EC2, Ubuntu server, Linux commands, Command Line Interface (CLI), Docker, ECR, ECS, Fargate, IAM.
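The project describes creating the task definition through the AWS ECS service itself; purely as an illustration, a boto3 sketch of registering a comparable Fargate task definition might look like the following (the region, account ID, image URI and role ARN are placeholders, not values from the repository):

```python
# Illustrative boto3 sketch of an ECS Fargate task definition for the dockerized app.
# All identifiers below (region, account ID, repository name, role ARN) are placeholders.
import boto3

ecs = boto3.client("ecs", region_name="eu-west-2")

ecs.register_task_definition(
    family="credit-risk-web-app",
    requiresCompatibilities=["FARGATE"],
    networkMode="awsvpc",
    cpu="256",
    memory="512",
    executionRoleArn="arn:aws:iam::123456789012:role/ecsTaskExecutionRole",
    containerDefinitions=[
        {
            "name": "credit-risk-web-app",
            # Image previously pushed to ECR in step 6.
            "image": "123456789012.dkr.ecr.eu-west-2.amazonaws.com/credit-risk-web-app:latest",
            "portMappings": [{"containerPort": 5000, "protocol": "tcp"}],
            "essential": True,
        }
    ],
)
```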

Language: HTML | Stargazers: 2 | Issues: 1 | Issues: 0

AWS_SageMaker_TensorFlow_Keras_CNN_Model_Fashion_MNIST

This is an AWS SageMaker machine learning project training a TensorFlow Keras CNN model on the Fashion-MNIST dataset.
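The description here is only a one-liner; as an illustration of the technique named in the title, a minimal Keras CNN for Fashion-MNIST might look like the following (the architecture and hyperparameters are illustrative assumptions, and the repo's SageMaker training wrapper is not shown):

```python
# Minimal sketch of a Keras CNN for Fashion-MNIST; architecture is an assumption.
import tensorflow as tf
from tensorflow.keras import layers, models

(x_train, y_train), (x_test, y_test) = tf.keras.datasets.fashion_mnist.load_data()
x_train = x_train[..., None] / 255.0   # (60000, 28, 28, 1), scaled to [0, 1]
x_test = x_test[..., None] / 255.0

model = models.Sequential([
    layers.Conv2D(32, 3, activation="relu", input_shape=(28, 28, 1)),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),
    layers.Dense(10, activation="softmax"),   # 10 Fashion-MNIST classes
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
model.fit(x_train, y_train, epochs=5, validation_data=(x_test, y_test))
```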

Language: Jupyter Notebook | Stargazers: 1 | Issues: 1 | Issues: 0

CC-Flask-CMS-API

This repository contains a web-based content (articles) management system/application for data science learning and career-journey content, with user registration and login functionality, built using Python, the Flask web framework, HTML and a PostgreSQL database. The application is implemented and deployed on the Heroku cloud platform.
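As an illustration of the stack described above, a minimal Flask endpoint backed by PostgreSQL might look like the following sketch (the articles table, its columns and the DATABASE_URL environment variable are assumptions, not taken from the repository):

```python
# Minimal sketch of a Flask + PostgreSQL articles endpoint; table name, columns
# and the DATABASE_URL environment variable are assumptions.
import os

import psycopg2
from flask import Flask, jsonify

app = Flask(__name__)


def get_connection():
    # Heroku-style configuration: the Postgres DSN is read from the environment.
    return psycopg2.connect(os.environ["DATABASE_URL"])


@app.route("/articles")
def list_articles():
    with get_connection() as conn, conn.cursor() as cur:
        cur.execute("SELECT id, title, author, created_at FROM articles ORDER BY created_at DESC")
        rows = cur.fetchall()
    return jsonify(
        [{"id": r[0], "title": r[1], "author": r[2], "created_at": str(r[3])} for r in rows]
    )


if __name__ == "__main__":
    app.run()
```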

Language: HTML | Stargazers: 1 | Issues: 1 | Issues: 0

Coursera-Deep-Learning-Specialization-2021

Notes, programming assignments and quizzes from all courses within the Coursera Deep Learning specialization offered by deeplearning.ai: (i) Neural Networks and Deep Learning; (ii) Improving Deep Neural Networks: Hyperparameter Tuning, Regularization and Optimization; (iii) Structuring Machine Learning Projects; (iv) Convolutional Neural Networks; (v) Sequence Models.

Stargazers: 1 | Issues: 0 | Issues: 0

Coursera-Deep-Learning-Specialization-2023

Contains solutions to the Deep Learning Specialization - Coursera.

License: MIT | Stargazers: 1 | Issues: 0 | Issues: 0

Coursera-Machine-Learning-Specialization-2023

Contains solutions and notes for the Machine Learning Specialization by Stanford University and Deeplearning.ai on Coursera (2022), taught by Prof. Andrew Ng.

License: MIT | Stargazers: 1 | Issues: 0 | Issues: 0

MLOps_AWS_Docker_Gunicorn_Flask_NLP_LDA_Topic_Modeling_sklearn_Framework

This is an AWS machine learning engineering (MLE) and MLOps project for NLP topic modeling with Latent Dirichlet Allocation (LDA), built with the scikit-learn framework and served as a dockerized Flask application behind Gunicorn.
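The description is only a one-liner; as an illustration of the technique named in the title, a minimal scikit-learn LDA topic model might look like this (the toy corpus, number of topics and preprocessing choices are assumptions):

```python
# Minimal sketch of an sklearn LDA topic model; corpus and settings are illustrative.
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer

documents = [
    "interest rates and bank lending",
    "machine learning models for credit risk",
    "docker containers deployed on aws",
]

vectorizer = CountVectorizer(stop_words="english")
doc_term_matrix = vectorizer.fit_transform(documents)

lda = LatentDirichletAllocation(n_components=2, random_state=42)
lda.fit(doc_term_matrix)

# Print the top words per topic.
terms = vectorizer.get_feature_names_out()
for topic_idx, weights in enumerate(lda.components_):
    top_terms = [terms[i] for i in weights.argsort()[::-1][:5]]
    print(f"topic {topic_idx}: {', '.join(top_terms)}")
```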

Language: Jupyter Notebook | Stargazers: 1 | Issues: 1 | Issues: 0

MLOps_AWS_Kubernetes_LoadBalancing_Docker_Flask_Banking_Customers_Digital_Transformation_Classifier

This is an AWS machine learning engineering (MLE) and MLOps project building a bank customers digital transformation classifier, deployed as a dockerized Flask application on Kubernetes with load balancing.

Language: Jupyter Notebook | Stargazers: 1 | Issues: 1 | Issues: 0

MLOps_AWS_Lightsail_Docker_Flask_ARCH_GARCH_Time_Series_Modeling_Statistical_Framework

This is an AWS machine learning engineering (MLE) and MLOps project for time series forecasting with ARCH and GARCH statistical volatility models, served as a dockerized Flask application on AWS Lightsail.

Language: Jupyter Notebook | Stargazers: 1 | Issues: 1 | Issues: 0

MLOps_AWS_Lightsail_Docker_Flask_Gaussian_Based_Time_Series_Modeling_Framework

This is an AWS machine learning engineering (MLE) and MLOps project for Gaussian-based time series forecasting, served as a dockerized Flask application on AWS Lightsail.

Language: Jupyter Notebook | Stargazers: 1 | Issues: 1 | Issues: 0

MLOps_AWS_Lightsail_Docker_Flask_Multi-Linear_Regression_Time_Series_Modeling_sklearn_Framework

This is an AWS machine learning engineering (MLE) and MLOps time series forecasting project using a multiple linear regression model, served as a dockerized Flask application on AWS Lightsail.
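As an illustration of the technique named in the title, a minimal multiple linear regression forecaster over lagged values might look like this sketch (the synthetic series and the choice of 3 lags are assumptions):

```python
# Minimal sketch of multiple linear regression for time-series forecasting with
# lagged features; the synthetic series and lag count are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(42)
series = np.cumsum(rng.normal(size=200))          # stand-in univariate series

# Build a supervised dataset: predict y[t] from y[t-3], y[t-2], y[t-1].
lags = 3
X = np.column_stack([series[lag:len(series) - lags + lag] for lag in range(lags)])
y = series[lags:]

model = LinearRegression().fit(X[:-20], y[:-20])  # hold out the last 20 points
print("R^2 on held-out points:", model.score(X[-20:], y[-20:]))
print("one-step forecast:", model.predict(series[-lags:].reshape(1, -1))[0])
```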

Language: Jupyter Notebook | Stargazers: 1 | Issues: 1 | Issues: 0

cli-demo

Public resources for Databricks CLI demo

Language: Python | License: Apache-2.0 | Stargazers: 0 | Issues: 0 | Issues: 0

copilot-codespaces-vscode

Develop with AI-powered code suggestions using GitHub Copilot and VS Code

License: MIT | Stargazers: 0 | Issues: 0 | Issues: 0

Coursera-Deep-Learning-Specialization-2021-Other

This repo contains my updated versions of all the assignments/labs of the Deep Learning Specialization on Coursera by Andrew Ng. It includes building various deep learning models from scratch and implementing them for object detection, facial recognition, autonomous driving, neural machine translation, trigger word detection, etc.

License: Apache-2.0 | Stargazers: 0 | Issues: 0 | Issues: 0