Yuantao Feng's repositories
how-to-optimize-gemm-opencl
Step-by-step GEMM optimization tutorial on OpenCL GPU platforms
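A step-by-step GEMM tutorial typically starts from an unoptimized baseline and measures each optimization against it. As an illustrative sketch only (not code from the repo, which targets OpenCL kernels), the naive triple-loop reference computing C = A · B looks like:

```python
# Naive reference GEMM (C = A @ B): the baseline a step-by-step
# optimization tutorial usually measures against. Illustrative sketch,
# not taken from the repository.

def gemm_naive(A, B):
    """Multiply an M x K matrix A by a K x N matrix B with three nested loops."""
    M, K, N = len(A), len(A[0]), len(B[0])
    C = [[0.0] * N for _ in range(M)]
    for i in range(M):
        for j in range(N):
            acc = 0.0
            for k in range(K):
                acc += A[i][k] * B[k][j]  # one multiply-add per inner step
            C[i][j] = acc
    return C

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
print(gemm_naive(A, B))  # [[19.0, 22.0], [43.0, 50.0]]
```

The optimized OpenCL versions in such a tutorial restructure exactly this loop nest (tiling, vectorization, local memory) while keeping the same result.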
opencv_zoo
Model zoo for OpenCV DNN, with benchmarks.
PyTorch-FLOPs
Only the SIZE of tensors flow for pure FLOPs calculation
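Counting FLOPs from tensor shapes alone means no data ever needs to be materialized: each layer's cost is a closed-form function of its input/output sizes. A minimal sketch of the idea (function names and the 2-FLOPs-per-multiply-add convention are my assumptions, not the repo's API):

```python
# Hypothetical sketch of shape-only FLOPs counting: only tensor SIZES
# are needed, never the tensor values. Multiply-adds counted as 2 FLOPs.

def linear_flops(batch, in_features, out_features):
    """FLOPs of a dense layer from its shapes alone."""
    return 2 * batch * in_features * out_features

def conv2d_flops(batch, in_ch, out_ch, out_h, out_w, kh, kw):
    """FLOPs of a 2D convolution from output shape and kernel size."""
    return 2 * batch * out_ch * out_h * out_w * in_ch * kh * kw

# ResNet-style stem conv: 7x7 kernel, 3 -> 64 channels, 112x112 output
print(conv2d_flops(1, 3, 64, 112, 112, 7, 7))  # 236027904
```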
pytorch_internal
Looking into the internals of PyTorch.
checkout-opencv
Action for checking out https://github.com/opencv/opencv
onnx_utils
Generate a valid ONNX model using only the `onnx` package
buff2steam
Make money with buff.163.com
ci-gha-workflow
GitHub Actions workflows for OpenCV project
CLBlast
Tuned OpenCL BLAS
DALI
A library containing both highly optimized building blocks and an execution engine for data pre-processing in deep learning applications
ExtendRandomBBoxCrop
Extend `RandomBBoxCrop` from NVIDIA/DALI to perform the same operations on both bounding boxes and extra data, such as bbox-related landmarks.
ficus
The programming language Ficus
HRNet-Facial-Landmark-Detection
This is an official implementation of facial landmark detection for our TPAMI paper "Deep High-Resolution Representation Learning for Visual Recognition". https://arxiv.org/abs/1908.07919
libfacedetection
An open source library for face detection in images. The face detection speed can reach 1000 FPS.
libfacedetection.train
The training program for libfacedetection, covering face detection and 5-point landmark detection.
onnx-simplifier
Simplify your ONNX model
onnxruntime
ONNX Runtime: cross-platform, high-performance ML inferencing and training accelerator
opencv-gha-dockerfile
GitHub Actions Dockerfiles for OpenCV project
opencv-worker-config
OpenCV buildbot worker scripts
opencv_contrib
Repository for OpenCV's extra modules
opencv_extra
OpenCV extra data
opencv_zoo_cpp
C++ version of OpenCV Zoo (https://github.com/opencv/opencv_zoo)
optimizer
Actively maintained ONNX Optimizer
Paddle2ONNX
PaddlePaddle to ONNX model converter
segment-anything
The repository provides code for running inference with the SegmentAnything Model (SAM), links for downloading the trained model checkpoints, and example notebooks that show how to use the model.
Tengine
Tengine is a lightweight, high-performance, modular inference engine for embedded devices
TIM-VX
VeriSilicon Tensor Interface Module
U-2-Net
The code for our newly accepted paper in Pattern Recognition 2020: "U^2-Net: Going Deeper with Nested U-Structure for Salient Object Detection."