rivers's starred repositories

N_m3u8DL-CLI

[.NET] m3u8 downloader. An open-source command-line m3u8/HLS/DASH downloader that supports standard AES-128-CBC decryption, multithreaded downloading, custom request headers, and more. Available in Simplified Chinese, Traditional Chinese, and English.

Language: C# · License: MIT · Stars: 13901
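
A minimal sketch of driving the downloader from Python via subprocess. The executable name and the flags shown (--workDir, --saveName, --maxThreads, --enableDelAfterDone) are assumptions recalled from the project's README and may differ between releases; check the tool's own help output.

```python
import subprocess

# Hedged sketch: executable name and flag names are assumptions and
# should be verified against the installed N_m3u8DL-CLI version.
subprocess.run(
    [
        "N_m3u8DL-CLI_v3.0.2.exe",                 # hypothetical executable name
        "https://example.com/stream/index.m3u8",   # illustrative playlist URL
        "--workDir", "Downloads",                  # output directory
        "--saveName", "episode01",                 # output file name
        "--maxThreads", "16",                      # parallel segment downloads
        "--enableDelAfterDone",                    # delete segment cache after merging
    ],
    check=True,
)
```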

tvm_mlir_learn

A collection of compiler learning resources.

Language: Python · Stars: 1960

AISystem

AISystem covers AI systems: the full low-level AI stack, including AI chips, AI compilers, and AI inference and training frameworks.

Language: Jupyter Notebook · License: Apache-2.0 · Stars: 9645

cuml

cuML - RAPIDS Machine Learning Library

Language: C++ · License: Apache-2.0 · Stars: 4062
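
cuML mirrors the scikit-learn estimator API on the GPU. A minimal k-means sketch, assuming a CUDA-capable environment with cuml and cupy installed; the data here is purely illustrative.

```python
import cupy as cp
from cuml.cluster import KMeans  # scikit-learn-style estimator, GPU-accelerated

# Random 2-D points generated on the GPU (illustrative data only).
X = cp.random.random((10_000, 2)).astype(cp.float32)

# Fit k-means entirely on the GPU; fit/predict follows the scikit-learn interface.
kmeans = KMeans(n_clusters=8, random_state=0)
labels = kmeans.fit_predict(X)
print(labels[:10], kmeans.cluster_centers_.shape)
```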

vega

An AutoML toolchain.

Language: Python · License: NOASSERTION · Stars: 841

optimizer

Actively maintained ONNX Optimizer

Language: C++ · License: Apache-2.0 · Stars: 619
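
A minimal sketch of applying onnxoptimizer's graph-rewrite passes to a saved model. The file paths are illustrative, and the two pass names are assumptions drawn from the optimizer's built-in pass list; onnxoptimizer.get_available_passes() reports what the installed version actually supports.

```python
import onnx
import onnxoptimizer

model = onnx.load("model.onnx")  # illustrative path

# A couple of common graph-rewrite passes (names assumed from the built-in list).
passes = ["eliminate_identity", "fuse_bn_into_conv"]
optimized = onnxoptimizer.optimize(model, passes)

onnx.checker.check_model(optimized)
onnx.save(optimized, "model.opt.onnx")
```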

tinyengine

[NeurIPS 2020] MCUNet: Tiny Deep Learning on IoT Devices; [NeurIPS 2021] MCUNetV2: Memory-Efficient Patch-based Inference for Tiny Deep Learning; [NeurIPS 2022] MCUNetV3: On-Device Training Under 256KB Memory

Language: C · License: MIT · Stars: 766

CMSIS-NN

CMSIS-NN Library

Language: C · License: Apache-2.0 · Stars: 172

onnx-simplifier

Simplify your ONNX model.

Language: C++ · License: Apache-2.0 · Stars: 3693
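
onnx-simplifier folds constant subgraphs and removes redundant nodes. A minimal sketch of its Python API (the same functionality is exposed as the onnxsim command-line tool); the file paths are illustrative.

```python
import onnx
from onnxsim import simplify

model = onnx.load("model.onnx")  # illustrative path

# simplify() returns the rewritten model plus a flag indicating whether the
# simplified graph still matches the original's outputs on random inputs.
model_simplified, check_ok = simplify(model)
assert check_ok, "simplified model failed the equivalence check"

onnx.save(model_simplified, "model.sim.onnx")
```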

AI-Infer-Engine-From-Zero

A handbook on building your own AI inference engine: everything you need to know, starting from scratch.

Stars: 240

nn-Meter

A DNN inference latency prediction toolkit for accurately modeling and predicting the latency on diverse edge devices.

Language: Python · License: MIT · Stars: 327
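
A sketch of predicting inference latency with nn-Meter. The predictor name and the load/predict calls follow the project's documented usage as recalled here, so treat the exact identifiers as assumptions and confirm them against the nn-Meter docs.

```python
import nn_meter

# Predictor names such as "cortexA76cpu_tflite21" come from nn-Meter's released
# predictor zoo; nn_meter.list_latency_predictors() shows what is available.
predictor = nn_meter.load_latency_predictor("cortexA76cpu_tflite21")

# Predict end-to-end inference latency (milliseconds) for a saved ONNX model.
latency_ms = predictor.predict("model.onnx", model_type="onnx")
print(f"predicted latency: {latency_ms:.2f} ms")
```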

ruff

An extremely fast Python linter and code formatter, written in Rust.

Language: Rust · License: MIT · Stars: 29186
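
ruff is normally run from the command line; a minimal sketch invoking `ruff check` and `ruff format` from Python via subprocess, assuming ruff is installed and on PATH.

```python
import subprocess

# Lint the current project; "--fix" applies autofixable rule violations in place.
subprocess.run(["ruff", "check", ".", "--fix"], check=False)

# Format the codebase with ruff's built-in, Black-compatible formatter.
subprocess.run(["ruff", "format", "."], check=False)
```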

TPAT

TensorRT Plugin Autogen Tool

Language: Python · License: Apache-2.0 · Stars: 364

NART

NART ("NART is not A RunTime") is a deep learning inference framework.

Language: Python · License: Apache-2.0 · Stars: 37