
Machine-learning-model-microservice

Machine learning model deployed as a microservice through FastAPI and Docker. Access the prediction endpoint anywhere by sending a request over HTTPS.

Background of the Model

The data used to develop this model was downloaded from the UCI Machine Learning Repository. The dataset contains bank customers' information, and the goal is to train a machine learning model that profiles the potential clients most likely to open a term deposit with the bank. The bank can then channel its marketing resources towards these customers to speed up the process. However, I was not entirely focused on developing a robust model but rather on testing out this API, although the performance is not bad for a lightly trained model, at 88% accuracy. You can find the notebook I used for the training here

How the API works

The API's predict endpoint receives a JSON object containing a customer's information and outputs a prediction label, either Yes or No, where Yes means the customer is most likely to open a term deposit with the bank and No means the opposite, alongside the probability for that prediction.
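As a rough illustration of what happens behind such an endpoint, the sketch below maps class probabilities to a Yes/No label plus its probability. `DummyModel` is a hypothetical stand-in for the trained classifier, not the actual model in this repository:

```python
# Sketch of the predict logic; DummyModel is a stand-in for the
# trained classifier (an assumption -- the real model is loaded from disk).
class DummyModel:
    def predict_proba(self, rows):
        # Pretend every customer has a 0.61 probability of class "No".
        return [[0.61, 0.39] for _ in rows]

LABELS = ["No", "Yes"]

def predict(customer: dict) -> dict:
    """Return the most likely label and its probability for one customer."""
    model = DummyModel()
    probs = model.predict_proba([customer])[0]
    best = max(range(len(probs)), key=probs.__getitem__)
    return {"prediction": [LABELS[best]], "probability": probs[best]}

print(predict({"age": 30, "job": "unemployed"}))
# -> {'prediction': ['No'], 'probability': 0.61}
```

In the real service the same shape of dictionary is serialized to JSON, which is why the response below contains a `prediction` list and a `probability` field.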

Testing the API via the FastAPI docs

Testing the API by calling it externally over HTTPS

Supply a JSON object containing the customer data

    customer_data =  {
                  "age": 30,
                  "job": "unemployed",
                  "marital": "married",
                  "education": "primary",
                  "default": "no",
                  "balance": 1787,
                  "housing": "no",
                  "loan": "no",
                  "day": 19,
                  "duration": 79,
                  "campaign": 1,
                  "pdays": -1,
                  "previous": 0,
                  "poutcome": "unkown"
                }

Make a POST request to the predict endpoint

        import requests
        response = requests.post('https://ml-model-ms.herokuapp.com/predict', json=customer_data)
        print(response.content)

Prediction output

    b'{"prediction":["No"],"probability":0.6121217836623571}'

See this repository for how this model microservice was integrated and how it consumes data via a data-polling system

Other things we can try

There are cases where the model is too large to upload to a repository, or where we need additional security. In such cases, we can upload the model to cloud storage such as a GCP or S3 bucket, and the service can fetch it using the generated access credentials. Feel free to test this API or ask me any questions; I would be glad to help.
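As a minimal sketch of that idea, assuming the model were exported as a pickle file and exposed through a pre-signed URL (the URL below is a placeholder, not a real endpoint; a cloud SDK such as boto3 could be used instead of plain `urllib`):

```python
import io
import pickle
import urllib.request

# Placeholder pre-signed URL -- in practice this would be generated from
# the bucket's access credentials and kept out of the repository.
MODEL_URL = "https://example-bucket.s3.amazonaws.com/term-deposit-model.pkl"

def load_remote_model(url: str):
    """Download a pickled model from cloud storage and deserialize it.

    Only unpickle files from storage you control: pickle can execute
    arbitrary code during deserialization.
    """
    with urllib.request.urlopen(url) as resp:
        return pickle.load(io.BytesIO(resp.read()))

# At service startup the API would do something like:
# model = load_remote_model(MODEL_URL)
# probs = model.predict_proba([customer_features])
```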


