SarthakGarg19 / Accelerating-Inference-in-Tensorflow-using-TensorRT

TensorRT optimises a Deep Learning model by reducing its size and accelerating its inference speed, squeezing out as much performance as possible so the model is well suited for deployment at the edge. This repository helps you convert a Deep Learning model from TensorFlow to TensorRT!
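
As a rough illustration of the kind of workflow this repository covers, the sketch below converts a TensorFlow SavedModel with the TF-TRT converter (`TrtGraphConverterV2` from TensorFlow 2.x). It is a minimal example, not the repository's exact script; the directory names and the FP16 precision choice are assumptions.

```python
# Minimal TF-TRT conversion sketch (assumes TensorFlow 2.x with TensorRT support).
# Paths and precision mode are placeholders; adjust them for your own model.
from tensorflow.python.compiler.tensorrt import trt_convert as trt

conversion_params = trt.DEFAULT_TRT_CONVERSION_PARAMS._replace(
    precision_mode=trt.TrtPrecisionMode.FP16  # lower precision for faster inference
)

converter = trt.TrtGraphConverterV2(
    input_saved_model_dir="saved_model_dir",      # original TensorFlow SavedModel
    conversion_params=conversion_params,
)
converter.convert()                               # replace supported ops with TensorRT engines
converter.save("trt_saved_model_dir")             # optimised SavedModel, loadable with tf.saved_model.load
```

The converted model can then be loaded and served like any other SavedModel, with the TensorRT-optimised segments executed on the GPU.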

Home Page: https://docs.nvidia.com/deeplearning/tensorrt/developer-guide/index.html
