Dinghow Yang (Dinghow)



Company: Peking University

Location: Hangzhou, China

Home Page: https://dinghow.site



Organizations
TJMSC

Dinghow Yang's starred repositories

DeepSpeed

DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.

Language: Python | License: Apache-2.0 | Stargazers: 33990 | Issues: 341 | Issues: 2655

generative_agents

Generative Agents: Interactive Simulacra of Human Behavior

peft

🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning.

Language: Python | License: Apache-2.0 | Stargazers: 15170 | Issues: 105 | Issues: 971
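The core idea behind adapters such as LoRA (one of the methods PEFT implements) is to freeze a large weight matrix W and train only a small low-rank update B·A. A dependency-free numeric sketch of that idea, with shapes and values invented for illustration (this is not PEFT's API):

```python
# LoRA-style forward pass: y = (W + B @ A) x, where only the small
# factors B (d x r) and A (r x d) would be trained; W stays frozen.
# Plain-list matrix helpers keep the sketch dependency-free.

def matvec(M, x):
    return [sum(m_ij * x_j for m_ij, x_j in zip(row, x)) for row in M]

def matmul(B, A):
    return [[sum(B[i][k] * A[k][j] for k in range(len(A)))
             for j in range(len(A[0]))] for i in range(len(B))]

def lora_forward(W, B, A, x):
    delta = matmul(B, A)  # rank-r update, r << d
    W_adapted = [[w + d for w, d in zip(rw, rd)] for rw, rd in zip(W, delta)]
    return matvec(W_adapted, x)

# 2x2 frozen weight, rank-1 update: B is 2x1, A is 1x2
W = [[1.0, 0.0], [0.0, 1.0]]
B = [[0.5], [1.0]]
A = [[2.0, 0.0]]
print(lora_forward(W, B, A, [1.0, 1.0]))
```

In practice the update is applied as W·x + B·(A·x) rather than by materialising the d×d delta, which is what keeps the trainable-parameter count and memory footprint small.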

Qwen

The official repo of the Qwen (通义千问) chat and pretrained large language models proposed by Alibaba Cloud.

Language: Python | License: Apache-2.0 | Stargazers: 12768 | Issues: 98 | Issues: 1032

flash-attention

Fast and memory-efficient exact attention

Language: Python | License: BSD-3-Clause | Stargazers: 12576 | Issues: 118 | Issues: 911
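The memory efficiency comes from computing the softmax normalisation incrementally, so the full attention-score matrix never has to be materialised. A pure-Python sketch of that online-softmax trick for one query row, with scores streamed one at a time (an illustration of the underlying idea, not the library's fused CUDA kernels):

```python
import math

def online_softmax_weighted_sum(scores, values):
    """Streaming softmax(scores) . values in one pass, without storing
    the exponentiated scores: track a running max, a running normaliser,
    and a running weighted sum, rescaling whenever the max changes."""
    m = float("-inf")  # running max (for numerical stability)
    denom = 0.0        # running sum of exp(score - m)
    acc = 0.0          # running weighted sum of values
    for s, v in zip(scores, values):
        m_new = max(m, s)
        scale = math.exp(m - m_new)  # rescale old terms to the new max
        denom = denom * scale + math.exp(s - m_new)
        acc = acc * scale + math.exp(s - m_new) * v
        m = m_new
    return acc / denom

# Agrees with an ordinary two-pass softmax computed over the whole row.
scores, values = [1.0, 3.0, 2.0], [10.0, 20.0, 30.0]
two_pass = (sum(math.exp(s - max(scores)) * v for s, v in zip(scores, values))
            / sum(math.exp(s - max(scores)) for s in scores))
print(abs(online_softmax_weighted_sum(scores, values) - two_pass) < 1e-9)
```

Because each incoming score only updates three scalars (per output element), attention can be computed tile by tile in fast on-chip memory instead of storing an N×N score matrix.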

tiktoken

tiktoken is a fast BPE tokeniser for use with OpenAI's models.

Language: Python | License: MIT | Stargazers: 11257 | Issues: 167 | Issues: 224

FlexGen

Running large language models on a single GPU for throughput-oriented scenarios.

Language: Python | License: Apache-2.0 | Stargazers: 9089 | Issues: 109 | Issues: 81

GLM-130B

GLM-130B: An Open Bilingual Pre-Trained Model (ICLR 2023)

Language: Python | License: Apache-2.0 | Stargazers: 7648 | Issues: 98 | Issues: 198

starcoder

Home of StarCoder: fine-tuning & inference!

Language: Python | License: Apache-2.0 | Stargazers: 7212 | Issues: 69 | Issues: 141

InternLM

Official release of the InternLM2.5 7B base and chat models, with 1M-token context support.

Language: Python | License: Apache-2.0 | Stargazers: 5870 | Issues: 54 | Issues: 306

GroundingDINO

[ECCV 2024] Official implementation of the paper "Grounding DINO: Marrying DINO with Grounded Pre-Training for Open-Set Object Detection"

Language: Python | License: Apache-2.0 | Stargazers: 5774 | Issues: 37 | Issues: 287

FasterTransformer

Transformer related optimization, including BERT, GPT

Language: C++ | License: Apache-2.0 | Stargazers: 5674 | Issues: 64 | Issues: 623

tinyproxy

tinyproxy - a light-weight HTTP/HTTPS proxy daemon for POSIX operating systems

Language: C | License: GPL-2.0 | Stargazers: 4694 | Issues: 106 | Issues: 398

x-transformers

A simple but complete full-attention transformer with a set of promising experimental features from various papers

Language: Python | License: MIT | Stargazers: 4409 | Issues: 53 | Issues: 201

lmdeploy

LMDeploy is a toolkit for compressing, deploying, and serving LLMs.

Language: Python | License: Apache-2.0 | Stargazers: 3493 | Issues: 33 | Issues: 1123

T2I-Adapter

T2I-Adapter

Language: Python | License: Apache-2.0 | Stargazers: 3325 | Issues: 40 | Issues: 107

Baichuan-13B

A 13B large language model developed by Baichuan Intelligent Technology

Language: Python | License: Apache-2.0 | Stargazers: 2976 | Issues: 31 | Issues: 194

lightllm

LightLLM is a Python-based LLM (Large Language Model) inference and serving framework, notable for its lightweight design, easy scalability, and high-speed performance.

Language: Python | License: Apache-2.0 | Stargazers: 2109 | Issues: 22 | Issues: 169

Pointcept

Pointcept: a codebase for point cloud perception research. Latest works: PTv3 (CVPR'24 Oral), PPT (CVPR'24), OA-CNNs (CVPR'24), MSC (CVPR'23)

Language: Python | License: MIT | Stargazers: 1373 | Issues: 20 | Issues: 270

test

Measuring Massive Multitask Language Understanding | ICLR 2021

Language: Python | License: MIT | Stargazers: 1084 | Issues: 20 | Issues: 19

lagent

A lightweight framework for building LLM-based agents

Language: Python | License: Apache-2.0 | Stargazers: 1073 | Issues: 14 | Issues: 47

LLM-in-Vision

Recent LLM-based CV and related works. Welcome to comment/contribute!

md2notion

A better Notion.so Markdown importer

Language: Python | License: MIT | Stargazers: 651 | Issues: 5 | Issues: 45

WanJuan1.0

WanJuan 1.0 multimodal corpus (万卷1.0多模态语料)

PointLLM

[ECCV 2024] PointLLM: Empowering Large Language Models to Understand Point Clouds

labelbee

LabelBee is an annotation library.

Language: TypeScript | License: Apache-2.0 | Stargazers: 237 | Issues: 5 | Issues: 14

examples

Lepton Examples

Language: Jupyter Notebook | License: Apache-2.0 | Stargazers: 138 | Issues: 10 | Issues: 4

jabbrv

Automatic Journal Title Abbreviation Package for LaTeX