Wei Huang (Aaronhuang-778)

Company: HKU - CVMI Lab (https://xjqi.github.io/cvmi.html)

Location: Hong Kong

Twitter: @AaronWeiHuang


Wei Huang's starred repositories


torchchat

Run PyTorch LLMs locally on servers, desktop and mobile

Language: Python · License: BSD-3-Clause · Stargazers: 2547 · Issues: 0

segment-anything-2

The repository provides code for running inference with the Meta Segment Anything Model 2 (SAM 2), links for downloading the trained model checkpoints, and example notebooks that show how to use the model.

Language: Jupyter Notebook · License: Apache-2.0 · Stargazers: 8571 · Issues: 0

unmet-promise

Code repository for the paper "The Unmet Promise of Synthetic Training Images: Using Retrieved Real Images Performs Better"

Language: Python · License: MIT · Stargazers: 3 · Issues: 0

Depth-Anything-V2

Depth Anything V2. A More Capable Foundation Model for Monocular Depth Estimation

Language: Python · License: Apache-2.0 · Stargazers: 2830 · Issues: 0

lmms-eval

Accelerating the development of large multimodal models (LMMs) with lmms-eval

Language: Python · License: NOASSERTION · Stargazers: 1185 · Issues: 0

flash-attention

Fast and memory-efficient exact attention

Language: Python · License: BSD-3-Clause · Stargazers: 12791 · Issues: 0
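
For orientation, a minimal usage sketch (assumes the flash_attn package is installed and a CUDA GPU is available; flash_attn_func expects fp16/bf16 tensors of shape (batch, seqlen, nheads, headdim)):

```python
# Minimal flash-attention usage sketch; requires a CUDA GPU.
import torch
from flash_attn import flash_attn_func

q = torch.randn(2, 1024, 8, 64, device="cuda", dtype=torch.float16)
k = torch.randn(2, 1024, 8, 64, device="cuda", dtype=torch.float16)
v = torch.randn(2, 1024, 8, 64, device="cuda", dtype=torch.float16)

out = flash_attn_func(q, k, v, causal=True)  # exact attention, tiled so it never materializes the full N x N score matrix
print(out.shape)                             # torch.Size([2, 1024, 8, 64])
```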

LayerMerge

Official PyTorch implementation of "LayerMerge: Neural Network Depth Compression through Layer Pruning and Merging" (ICML'24)

Language: Python · License: MIT · Stargazers: 21 · Issues: 0

larq

An Open-Source Library for Training Binarized Neural Networks

Language: Python · License: Apache-2.0 · Stargazers: 698 · Issues: 0

ChatTTS

A generative speech model for daily dialogue.

Language: Python · License: AGPL-3.0 · Stargazers: 28851 · Issues: 0

BGEMM-CUDA

A repository for Binary General Matrix Multiplication (BGEMM) with customized CUDA kernels. Thanks to FP6-LLM for the groundwork!

Language: Cuda · License: Apache-2.0 · Stargazers: 10 · Issues: 0
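
The core idea behind binary GEMM, multiply-accumulate over {-1, +1} values replaced by XOR and popcount on packed bits, fits in a short NumPy sketch; pack_signs and bgemm below are illustrative names, not this repo's CUDA API:

```python
# Sketch of binary GEMM via XOR + popcount on bit-packed {-1, +1} matrices.
# For sign vectors of length K: dot(a, b) = K - 2 * (# of mismatched signs).
import numpy as np

def pack_signs(x):
    # Pack a {-1, +1} matrix into bit-planes (1 bit per entry) along the last axis.
    return np.packbits(x > 0, axis=-1)

def bgemm(a, b):
    # a: (M, K), b: (N, K), entries in {-1, +1}; returns a @ b.T exactly.
    K = a.shape[-1]
    pa, pb = pack_signs(a), pack_signs(b)
    out = np.empty((a.shape[0], b.shape[0]), dtype=np.int32)
    for i in range(a.shape[0]):
        # XOR marks mismatched signs; popcount them per row of b.
        mism = np.unpackbits(pa[i] ^ pb, axis=-1)[:, :K].sum(axis=-1).astype(np.int64)
        out[i] = K - 2 * mism
    return out

rng = np.random.default_rng(0)
a = np.where(rng.standard_normal((4, 64)) >= 0, 1, -1).astype(np.int8)
b = np.where(rng.standard_normal((3, 64)) >= 0, 1, -1).astype(np.int8)
assert np.array_equal(bgemm(a, b), a.astype(np.int32) @ b.T.astype(np.int32))
```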

cambrian

Cambrian-1 is a family of multimodal LLMs with a vision-centric design.

Language: Python · License: Apache-2.0 · Stargazers: 1628 · Issues: 0

ToMe

A method to increase the speed and lower the memory footprint of existing vision transformers.

Language: Python · License: NOASSERTION · Stargazers: 917 · Issues: 0
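
A rough sketch of the underlying idea, bipartite matching plus averaging of the most similar tokens; merge_tokens is a hypothetical helper that simplifies the paper's exact scheme (partner collisions simply overwrite here):

```python
# Sketch of bipartite token merging: alternate tokens into two sets,
# match each token in A to its most similar token in B, and average
# the r most similar pairs to shorten the sequence.
import torch
import torch.nn.functional as F

def merge_tokens(x, r):
    # x: (N, C) token features; returns (N - r, C).
    a, b = x[0::2], x[1::2]
    sim = F.normalize(a, dim=-1) @ F.normalize(b, dim=-1).t()
    score, idx = sim.max(dim=-1)                  # best partner in B per A token
    order = score.argsort(descending=True)
    merge, keep = order[:r], order[r:]            # merge the r most similar pairs
    b = b.clone()
    b[idx[merge]] = (b[idx[merge]] + a[merge]) / 2  # (sketch: collisions overwrite)
    return torch.cat([a[keep], b], dim=0)

x = torch.randn(16, 64)
print(merge_tokens(x, r=4).shape)                 # torch.Size([12, 64])
```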

lmdeploy

LMDeploy is a toolkit for compressing, deploying, and serving LLMs.

Language: Python · License: Apache-2.0 · Stargazers: 3731 · Issues: 0

InternVL

[CVPR 2024 Oral] InternVL Family: A Pioneering Open-Source Alternative to GPT-4o. A commercially usable open-source multimodal dialogue model with performance close to GPT-4o.

Language: Python · License: MIT · Stargazers: 4660 · Issues: 0

outlines

Structured Text Generation

Language: Python · License: Apache-2.0 · Stargazers: 7494 · Issues: 0

malaysian-dataset

We gather Malaysian datasets! https://malaysian-dataset.readthedocs.io/

Language: Jupyter Notebook · License: Apache-2.0 · Stargazers: 294 · Issues: 0

DeepSeek-Coder-V2

DeepSeek-Coder-V2: Breaking the Barrier of Closed-Source Models in Code Intelligence

License: MIT · Stargazers: 1609 · Issues: 0

llama-moe

⛷️ LLaMA-MoE: Building Mixture-of-Experts from LLaMA with Continual Pre-training

Language: Python · License: Apache-2.0 · Stargazers: 816 · Issues: 0

chameleon

Repository for Meta Chameleon, a mixed-modal early-fusion foundation model from FAIR.

Language: Python · License: NOASSERTION · Stargazers: 1642 · Issues: 0

mistral-inference

Official inference library for Mistral models

Language: Jupyter Notebook · License: Apache-2.0 · Stargazers: 9394 · Issues: 0

LLM-QAT

Code repo for the paper "LLM-QAT: Data-Free Quantization Aware Training for Large Language Models"

Language: Python · License: NOASSERTION · Stargazers: 222 · Issues: 0
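
For context, the basic QAT building block is fake quantization with a straight-through estimator; a generic per-tensor symmetric sketch, not the paper's data-free recipe:

```python
# Generic symmetric fake quantization with a straight-through estimator (STE):
# forward uses the quantized value, backward passes gradients through unchanged.
import torch

def fake_quant(x, bits=4):
    qmax = 2 ** (bits - 1) - 1
    scale = x.abs().max().clamp(min=1e-8) / qmax          # per-tensor symmetric scale
    q = torch.clamp(torch.round(x / scale), -qmax - 1, qmax) * scale
    return x + (q - x).detach()                           # STE trick

w = torch.randn(8, 8, requires_grad=True)
fake_quant(w, bits=4).sum().backward()                    # gradients flow as if unquantized
print(w.grad.abs().mean())
```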

Quest

[ICML 2024] Quest: Query-Aware Sparsity for Efficient Long-Context LLM Inference

Language: Cuda · Stargazers: 105 · Issues: 0
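
The gist, as a hedged sketch: cache channel-wise min/max keys per KV page, upper-bound each page's attention scores against the current query, and attend only to the top pages; select_pages is an illustrative helper, not this repo's kernel:

```python
# Sketch of query-aware page selection: an upper bound on q . k over a page
# with channel-wise key range [kmin, kmax] is sum_i max(q_i*kmin_i, q_i*kmax_i).
import torch

def select_pages(q, keys, page=16, topk=4):
    # q: (D,), keys: (N, D) with N divisible by page; returns indices of top pages.
    kp = keys.view(-1, page, keys.shape[-1])              # (P, page, D)
    kmin, kmax = kp.min(dim=1).values, kp.max(dim=1).values
    bound = torch.maximum(q * kmin, q * kmax).sum(-1)     # (P,) per-page score bounds
    return bound.topk(topk).indices

q = torch.randn(64)
keys = torch.randn(128, 64)
print(select_pages(q, keys))                              # 4 selected page indices
```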

moe-quantization

Official code for the paper "Examining Post-Training Quantization for Mixture-of-Experts: A Benchmark"

Language: Python · License: MIT · Stargazers: 6 · Issues: 0

DeepSeek-V2

DeepSeek-V2: A Strong, Economical, and Efficient Mixture-of-Experts Language Model

License: MIT · Stargazers: 3210 · Issues: 0

hf-daily-paper-newsletter-chinese

HF🤗 daily papers digest bot (in Chinese)

Language: Python · License: Apache-2.0 · Stargazers: 32 · Issues: 0