
Machine Learning Containers for Jetson and JetPack


Modular container build system that provides various AI/ML packages for NVIDIA Jetson 🚀🤖

ML: pytorch, tensorflow, onnxruntime, deepstream, tritonserver, jupyterlab, stable-diffusion
LLM: transformers, text-generation-webui, text-generation-inference, llava, llama.cpp, exllama, llamaspeak, awq, AutoGPTQ, MiniGPT-4, MLC, langchain, optimum, bitsandbytes, nemo, riva
L4T: l4t-pytorch, l4t-tensorflow, l4t-ml, l4t-diffusion, l4t-text-generation
VIT: NanoOWL, NanoSAM, Segment Anything (SAM), Track Anything (TAM)
CUDA: cupy, cuda-python, pycuda, numba, cudf, cuml
Robotics: ros, ros2, opencv:cuda, realsense, zed
VectorDB: NanoDB, FAISS, RAFT

See the packages directory for the full list, including pre-built container images and CI/CD status for JetPack/L4T.

Using the included tools, you can easily combine packages into your own containers. Want to run ROS2 with PyTorch and Transformers? No problem - just do the system setup and build it on your Jetson like this:

$ ./build.sh --name=my_container ros:humble-desktop pytorch transformers
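
Once the build finishes, the image is tagged with the name you gave it plus your L4T version (the exact tag is printed at the end of the build - the r35.4.1 below is just an example), and you can start it with run.sh:

$ ./run.sh my_container:r35.4.1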

There are shortcuts for running containers too - this will pull or build a compatible l4t-pytorch image:

$ ./run.sh $(./autotag l4t-pytorch)

run.sh forwards its arguments to docker run with some defaults added (like --runtime nvidia, mounting a /data cache, and detecting devices).
autotag finds a container image that's compatible with your version of JetPack/L4T - either one found locally, pulled from a registry, or built on the spot.
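
Because run.sh forwards everything after the image name to docker run, you can also append a command to execute inside the container - for example, a quick check that PyTorch sees the GPU (assuming the l4t-pytorch image resolved by autotag):

$ ./run.sh $(./autotag l4t-pytorch) python3 -c 'import torch; print(torch.cuda.is_available())'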

If you look at any package's readme (like l4t-pytorch), it will have detailed instructions for running its container.

Documentation

Check out the tutorials at the Jetson Generative AI Lab!

Getting Started

Refer to the System Setup page for tips about setting up your Docker daemon and memory/storage tuning.

sudo apt-get update && sudo apt-get install git python3-pip
git clone --depth=1 https://github.com/dusty-nv/jetson-containers
cd jetson-containers
pip3 install -r requirements.txt
./run.sh $(./autotag l4t-pytorch)

Or you can manually run a container image of your choice without using the helper scripts above:

sudo docker run --runtime nvidia -it --rm --network=host dustynv/l4t-pytorch:r35.4.1
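
If you want that manual command to behave more like run.sh, you can add the same kind of cache mount it sets up (the host-side data directory below is an assumption - adjust the path to wherever you keep your model cache):

sudo docker run --runtime nvidia -it --rm --network=host \
    --volume $(pwd)/data:/data \
    dustynv/l4t-pytorch:r35.4.1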

Looking for the old jetson-containers? See the legacy branch.

License

MIT License

