aws-lambda-docker-serverless-inference

Serve scikit-learn, XGBoost, TensorFlow, and PyTorch models with AWS Lambda container images support.

Pay-as-you-go inference with AWS Lambda (Container Image Support)

This repository contains resources to help you deploy Lambda functions packaged as Python and Java Docker images.

The deployed applications illustrate how to perform inference for scikit-learn, XGBoost, TensorFlow, and PyTorch models using a Lambda function.

Overview

AWS Lambda is one of the most cost-effective services for running code without provisioning or managing servers.

It offers many advantages when working with serverless infrastructure. When you break the logic of your machine learning service down to a single Lambda function handling a single request, things become much simpler and easier to scale.

You can forget about the resource management needed to handle parallel requests to your model.

If your usage is sparse and can tolerate higher latency (for example, occasional cold starts), Lambda is a great choice among the various serving options.
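The single-function, single-request pattern described above usually loads the model once at module scope, so warm invocations reuse it and only cold starts pay the loading cost. Below is a minimal sketch of such a handler; the stub model, field names (`instances`, `predictions`), and the joblib loading mentioned in the comment are illustrative assumptions, not this repo's exact code.

```python
import json

# Stand-in for a real model. In the scikit-learn example this would be
# something like: model = joblib.load("model.joblib"). Loading at module
# scope means the model is deserialized once per container (cold start)
# and reused across warm invocations.
class _StubModel:
    def predict(self, rows):
        # Hypothetical scorer standing in for a trained model.
        return [sum(r) for r in rows]

model = _StubModel()

def handler(event, context):
    """Lambda entry point: parse the request body, run inference, return JSON."""
    payload = json.loads(event["body"])
    preds = model.predict(payload["instances"])
    return {
        "statusCode": 200,
        "body": json.dumps({"predictions": preds}),
    }
```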

Repository Structure

The repository contains the following resources:

Installation Instructions

  1. Create an AWS account if you do not already have one and login.

  2. Install Docker Desktop

  3. Install the AWS CLI and configure your AWS credentials.

  4. Clone the repo onto your local development machine using `git clone`.

  5. Open the project in your IDE of choice to run the example Python and Java files.

  6. Follow the instructions in each example's README.md file.
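Each example packages its handler and model into a container image built on an AWS-provided Lambda base image. As an illustration, a Dockerfile for a Python-based inference function might look like the following (the file names `app.py`, `model.joblib`, and `requirements.txt` are hypothetical placeholders, not this repo's exact layout):

```dockerfile
# AWS-provided base image with the Lambda Python runtime
# and the runtime interface client preinstalled.
FROM public.ecr.aws/lambda/python:3.9

# Install inference dependencies (hypothetical requirements file).
COPY requirements.txt .
RUN pip install -r requirements.txt

# Copy the handler and serialized model into the Lambda task root.
COPY app.py model.joblib ${LAMBDA_TASK_ROOT}/

# Invoke app.handler(event, context) for each request.
CMD ["app.handler"]
```

The image can be pushed to Amazon ECR and deployed as a Lambda function with container image support; see each example's README.md for its actual build and deploy steps.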

Questions?

Please contact @e_sela or raise an issue on this repo.

License

This library is licensed under the MIT-0 License. See the LICENSE file.
