Hidet: A compilation-based deep learning framework

Documentation

Hidet is an open-source DNN inference framework based on compilation. It supports end-to-end compilation of DNN models from PyTorch and ONNX to efficient CUDA kernels. A series of graph-level and operator-level optimizations are applied to improve performance.

Getting Started

Installation

pip install hidet

See here for building from source.
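
To verify the installation, a minimal import check can be run (an illustrative sketch; it assumes hidet exposes the conventional __version__ attribute):

import hidet
print(hidet.__version__)   # should print the installed version without errors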

Usage

Optimize a PyTorch model through hidet (requires PyTorch 2.0):

import torch
import hidet

# Register the hidet backend for PyTorch Dynamo; this can be omitted if you import torch before hidet
hidet.torch.register_dynamo_backends()

# Define pytorch model
model = torch.hub.load('pytorch/vision:v0.6.0', 'resnet18', pretrained=True).cuda().eval()
x = torch.rand(1, 3, 224, 224).cuda()

# Compile the model through Hidet
model_opt = torch.compile(model, backend='hidet')  

# Run the optimized model
y = model_opt(x)
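
As a quick sanity check (an illustrative sketch, not part of the original example), the output of the compiled model can be compared with eager PyTorch execution:

# Compare the hidet-compiled model against eager PyTorch execution (illustrative check)
with torch.no_grad():
    y_ref = model(x)       # eager PyTorch output
    y_opt = model_opt(x)   # hidet-compiled output

# The results should agree within floating-point tolerance
print(torch.allclose(y_ref, y_opt, atol=1e-3, rtol=1e-3))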

See the tutorials in the documentation to learn about other usage.

Publication

Hidet originates from the following research work. If you use Hidet in your research, please consider citing our paper.

  • Hidet: Task-Mapping Programming Paradigm for Deep Learning Tensor Programs.
    Yaoyao Ding, Cody Hao Yu, Bojian Zheng, Yizhi Liu, Yida Wang, and Gennady Pekhimenko. ASPLOS 2023.

Development

Hidet is currently under active development by a team at CentML Inc.

Contributing

We welcome contributions from the community. Please see contribution guide for more details.

License

Hidet is released under the Apache 2.0 license.
