There are 12 repositories under the tensorrt-inference topic.
Deep learning API and server in C++14 with support for Caffe, PyTorch, TensorRT, Dlib, NCNN, TensorFlow, XGBoost, and T-SNE
FastFlowNet: A Lightweight Network for Fast Optical Flow Estimation (ICRA 2021)
Hardware-accelerated DNN model inference ROS2 packages using NVIDIA Triton/TensorRT for both Jetson and x86_64 with CUDA-capable GPU
You can use DBNet to detect words or barcodes; knowledge distillation and Python TensorRT inference are also provided.
YOLOv5 TensorRT implementations
A TensorRT version of UNet, inspired by tensorrtx
Using TensorRT for Inference Model Deployment.
An oriented object detection framework based on TensorRT
Advanced inference performance using TensorRT for CRAFT text detection. Implements modules to convert PyTorch -> ONNX -> TensorRT, with dynamic-shape (multi-size input) inference (see the export sketch after this list).
Export (from ONNX) and run inference with a TensorRT engine in C++.
Export (from ONNX) and run inference with a TensorRT engine in Python (see the build sketch after this list).
This project is a notebook for learning TensorRT.
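
Several of the repositories above start from a PyTorch -> ONNX export with dynamic input shapes. The sketch below shows that export step only, assuming a recent torchvision and using ResNet-18 as a stand-in model; the model choice, file name, and tensor names are illustrative placeholders, not taken from any listed repository.

```python
import torch
import torchvision

# Stand-in network; any traceable PyTorch model would export the same way.
model = torchvision.models.resnet18(weights=None).eval()
dummy = torch.randn(1, 3, 224, 224)  # example input used to trace the graph

torch.onnx.export(
    model,
    dummy,
    "model.onnx",
    input_names=["input"],
    output_names=["output"],
    # Mark batch, height, and width as dynamic so a TensorRT engine built
    # from this ONNX file can accept multi-size inputs.
    dynamic_axes={
        "input": {0: "batch", 2: "height", 3: "width"},
        "output": {0: "batch"},
    },
    opset_version=13,
)
```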
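The other common step these repositories describe is building a TensorRT engine from the exported ONNX file and loading it from Python. The sketch below shows that flow under TensorRT 8.x-style Python bindings; exact API names vary between TensorRT versions, and the file names and shape ranges are placeholders.

```python
import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)

def build_engine(onnx_path: str, engine_path: str) -> None:
    """Parse an ONNX file and serialize a TensorRT engine to disk."""
    builder = trt.Builder(TRT_LOGGER)
    network = builder.create_network(
        1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
    parser = trt.OnnxParser(network, TRT_LOGGER)

    with open(onnx_path, "rb") as f:
        if not parser.parse(f.read()):
            for i in range(parser.num_errors):
                print(parser.get_error(i))
            raise RuntimeError("Failed to parse the ONNX model")

    config = builder.create_builder_config()
    if builder.platform_has_fast_fp16:
        config.set_flag(trt.BuilderFlag.FP16)  # optional FP16 build

    # The exported model has dynamic shapes, so an optimization profile
    # is required; the ranges below are placeholder values.
    profile = builder.create_optimization_profile()
    profile.set_shape("input",
                      (1, 3, 224, 224),    # min
                      (1, 3, 640, 640),    # opt
                      (4, 3, 1280, 1280))  # max
    config.add_optimization_profile(profile)

    serialized = builder.build_serialized_network(network, config)
    with open(engine_path, "wb") as f:
        f.write(serialized)

def load_engine(engine_path: str):
    """Deserialize a previously built engine for inference."""
    runtime = trt.Runtime(TRT_LOGGER)
    with open(engine_path, "rb") as f:
        return runtime.deserialize_cuda_engine(f.read())

if __name__ == "__main__":
    build_engine("model.onnx", "model.engine")
    engine = load_engine("model.engine")
    print("TensorRT engine built and deserialized")
```

Actually running inference on the deserialized engine additionally needs an execution context and device buffers (e.g. via pycuda or cuda-python); that part is omitted here for brevity.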