abecadel / server

The Triton Inference Server provides an optimized cloud and edge inferencing solution.
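As context for the description above: Triton serves models from a "model repository" directory, with each model described by a `config.pbtxt` file. The sketch below is illustrative only; the model name, backend, and tensor names/shapes are assumptions for a generic ONNX image classifier, not taken from this repository.

```
# Assumed repository layout:
#   model_repository/
#   └── densenet_onnx/        # hypothetical model name
#       ├── config.pbtxt      # this file
#       └── 1/                # version directory
#           └── model.onnx    # serialized model

name: "densenet_onnx"
platform: "onnxruntime_onnx"   # ONNX Runtime backend
max_batch_size: 8
input [
  {
    name: "data_0"             # hypothetical input tensor
    data_type: TYPE_FP32
    dims: [ 3, 224, 224 ]
  }
]
output [
  {
    name: "fc6_1"              # hypothetical output tensor
    data_type: TYPE_FP32
    dims: [ 1000 ]
  }
]
```

The server is then pointed at the repository root, e.g. `tritonserver --model-repository=/path/to/model_repository`.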

Home Page: https://docs.nvidia.com/deeplearning/triton-inference-server/user-guide/docs/index.html

This repository is not active

About

License: BSD 3-Clause "New" or "Revised" License


Languages

- Python 51.2%
- Shell 24.5%
- C++ 19.9%
- CMake 1.5%
- Java 1.4%
- Roff 1.0%
- Smarty 0.4%
- Dockerfile 0.0%