Hongyu Wang's repositories
awesome-detection-transformer
Collect some papers about transformer for detection and segmentation. Awesome Detection Transformer for Computer Vision (CV)
ControlNet
Let us control diffusion models
deep-learning-for-image-processing
Deep learning for image processing, including classification, object detection, etc.
detr
End-to-End Object Detection with Transformers
detrex
detrex is a research platform for Transformer-based Instance Recognition algorithms including DETR (ECCV 2020), Deformable-DETR (ICLR 2021), Conditional-DETR (ICCV 2021), DAB-DETR (ICLR 2022), DN-DETR (CVPR 2022), DINO (ICLR 2023), H-DETR (CVPR 2023), MaskDINO (CVPR 2023), etc.
evals
Evals is a framework for evaluating OpenAI models and an open-source registry of benchmarks.
fairseq
Facebook AI Research Sequence-to-Sequence Toolkit written in Python.
flash-attention
Fast and memory-efficient exact attention
FlexGen
Running large language models like OPT-175B/GPT-3 on a single GPU. Focusing on high-throughput large-batch generation.
LAVIS
LAVIS - A One-stop Library for Language-Vision Intelligence
mae
PyTorch implementation of MAE https://arxiv.org/abs/2111.06377
MT-Reading-List
A machine translation reading list maintained by Tsinghua Natural Language Processing Group
Scene-Graph-Benchmark.pytorch
A new codebase for popular Scene Graph Generation methods (2020). Visualization & Scene Graph Extraction on custom images/datasets are provided. It's also a PyTorch implementation of the paper “Unbiased Scene Graph Generation from Biased Training” (CVPR 2020).
torchscale
Transformers at any scale
gpt-fast
Simple and efficient pytorch-native transformer text generation in <1000 LOC of python.
Grounded-Segment-Anything
Marrying Grounding DINO with Segment Anything & Stable Diffusion - Detect, Segment, and Generate Anything with Text Inputs
JARVIS
JARVIS, a system to connect LLMs with the ML community
llama
Inference code for LLaMA models
LMFlow
An Extensible Toolkit for Finetuning and Inference of Large Foundation Models. Large Model for All.
LMOps
General technology for enabling AI capabilities w/ LLMs and MLLMs
Metaworld
Collections of robotics environments geared towards benchmarking multi-task and meta reinforcement learning
nebullvm
Plug-and-play modules to optimize the performance of your AI systems 🚀
taichi
Productive & portable high-performance programming in Python.
TransformerEngine
A library for accelerating Transformer models on NVIDIA GPUs, including using 8-bit floating point (FP8) precision on Hopper and Ada GPUs, to provide better performance with lower memory utilization in both training and inference.
transformers
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
ustcwhy.github.io
GitHub Pages template for academic personal websites, forked from mmistakes/minimal-mistakes
VideoMAE
[NeurIPS 2022 Spotlight] VideoMAE: Masked Autoencoders are Data-Efficient Learners for Self-Supervised Video Pre-Training