JackLongKing / tensorRTIntegrate

TensorRT ONNX plugin, inference, and compilation


TensorRT-Integrate

  1. Supports PyTorch ONNX plugins (DCN, HSwish, etc.)
  2. Simpler inference and plugin APIs

Re-implementation

(Demo images: coco.tracking.jpg, selfie.draw.jpg)

Use TensorRT-Integrate

Install protobuf >= v3.8.x, then build and run:

bash getDLADCN.sh
make run -j32

Inference Code

auto engine = TRTInfer::loadEngine("models/efficientnet-b0.fp32.trtmodel");  // load a serialized engine
float mean[3] = {0.485f, 0.456f, 0.406f};  // ImageNet per-channel means
float std[3] = {0.229f, 0.224f, 0.225f};   // ImageNet per-channel stddevs
Mat image = imread("img.jpg");
engine->input()->setNormMat(0, image, mean, std);  // normalize and bind as input 0
engine->forward();                                  // run inference
engine->output(0)->print();                         // dump the first output tensor

Environment


Plugin

  1. Export ONNX from PyTorch: plugin_onnx_export.py
  2. Plugin implementations: MReLU.cu, HSwish.cu, DCNv2.cu



Languages

C++ 75.2% · C 9.7% · Cuda 9.1% · Python 5.7% · Makefile 0.3% · Shell 0.0%