roychou121 / triton-inference-server

The Triton Inference Server provides a cloud inferencing solution optimized for NVIDIA GPUs.

This repository is not active

About

License: BSD 3-Clause "New" or "Revised" License
Languages

C++ 57.7%
Python 28.2%
Shell 9.6%
CMake 2.6%
Roff 0.6%
Cuda 0.5%
Dockerfile 0.4%
C 0.2%
Go 0.2%
Smarty 0.1%