cpptensorrtz

Convert popular deep learning models to TensorRT using the C++ API (preferably).


Index

1. Getting Started

Using Docker (preferred)

Note: This method has been tested on Ubuntu distributions. It does not work on Windows/macOS, as nvidia-docker is not supported there.

  1. Install Docker (skip this step if Docker is already installed). The commands below assume Docker's apt repository has already been configured; see the Docker quick-start link under Resources.
 $ sudo apt-get update
 $ sudo apt-get install docker-ce docker-ce-cli containerd.io
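  Optionally, as a quick smoke test that Docker works before adding GPU support (the hello-world image is a generic Docker test image, not part of this repo):
 $ sudo docker run --rm hello-world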
  2. Install nvidia-container-toolkit to use GPUs with Docker. Reference: nvidia-docker.
 $ distribution=$(. /etc/os-release;echo $ID$VERSION_ID)
 $ curl -s -L https://nvidia.github.io/nvidia-docker/gpgkey | sudo apt-key add -
 $ curl -s -L https://nvidia.github.io/nvidia-docker/$distribution/nvidia-docker.list | sudo tee /etc/apt/sources.list.d/nvidia-docker.list
 $ sudo apt-get update && sudo apt-get install -y nvidia-container-toolkit
 $ sudo systemctl restart docker
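  To check that containers can now see the GPU, the nvidia-docker documentation suggests running nvidia-smi inside a CUDA base image; the exact image tag below is only an example and may need to match your driver/CUDA version:
 $ docker run --rm --gpus all nvidia/cuda:11.0-base nvidia-smi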
  3. Pull the TensorRT image from NVIDIA GPU Cloud (NGC).
 $ docker pull nvcr.io/nvidia/tensorrt:20.12-py3
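  The pulled image can then be started with GPU access. The volume mount below is only an example for making a local checkout of this repo visible inside the container; adjust the paths as needed:
 $ docker run --gpus all -it --rm -v $(pwd):/workspace/cpptensorrtz nvcr.io/nvidia/tensorrt:20.12-py3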

2. Resources

  1. Docker quick-start
  2. TensorRT Developer Guide
  3. TensorRT on NVIDIA GPU Cloud (NGC)
  4. wang-xinyu/tensorrtx. Thanks to this repo for all the resources and inspiration.
