heluocs's repositories
heluocs
Config files for my GitHub profile.
arena
A CLI for Kubeflow.
nvitop
An interactive NVIDIA-GPU process viewer, the one-stop solution for GPU process management.
Coursera-ML-AndrewNg-Notes
Personal notes for Andrew Ng's Machine Learning course.
Awesome-pytorch-list
A comprehensive list of PyTorch-related content on GitHub, such as different models, implementations, helper libraries, tutorials, etc.
nebullvm
Easy-to-use library to boost AI inference leveraging state-of-the-art optimization techniques.
buddy-mlir
An MLIR-Based Ideas Landing Project
trt-samples-for-hackathon-cn
Simple samples for TensorRT programming
TPAT
TensorRT Plugin Autogen Tool
torch-mlir
The Torch-MLIR project aims to provide first class support from the PyTorch ecosystem to the MLIR ecosystem.
BladeDISC
BladeDISC is an end-to-end DynamIc Shape Compiler project for machine learning workloads.
Torch-TensorRT
PyTorch/TorchScript compiler for NVIDIA GPUs using TensorRT
mpi-operator
Repository for the MPI operator.
tiny-tensorrt
Deploy your model with TensorRT quickly.
tensorrt
TensorFlow/TensorRT integration
AnimeGANv2
[Open Source] The improved version of AnimeGAN; converts landscape photos/videos to anime style.
animegan2-pytorch
PyTorch implementation of AnimeGANv2
triton_transformers
Deploy optimized transformer based models on Nvidia Triton server
learnopencv
Learn OpenCV : C++ and Python Examples
ai-matrix
Benchmarks to make it easy to evaluate AI accelerators.
nimble
Lightweight and Parallel Deep Learning Framework
gpu-monitoring-tools
Tools for monitoring NVIDIA GPUs on Linux
serving
A flexible, high-performance serving system for machine learning models
ParlAI
A framework for training and evaluating AI models on a variety of openly available dialogue datasets.
Forward
A library for high performance deep learning inference on NVIDIA GPUs.
tensorrt-plugin-demo
A simple demo of a TensorRT plugin.
alien
alien is a GPU-accelerated artificial life simulation program.
onnxruntime
ONNX Runtime: a cross-platform, high-performance ML inferencing and training accelerator.
ncnn
ncnn is a high-performance neural network inference framework optimized for mobile platforms.