Lianke Qin (brucechin)

Company: Bytedance Inc

Home Page: liankeqin.com

Lianke Qin's starred repositories

VALI

Video processing in Python

Language: C++ · License: Apache-2.0 · Stargazers: 19 · Issues: 0

OpenDiT

OpenDiT: An Easy, Fast and Memory-Efficient System for DiT Training and Inference

Language: Python · License: Apache-2.0 · Stargazers: 1051 · Issues: 0

DiT

Official PyTorch Implementation of "Scalable Diffusion Models with Transformers"

Language: Python · License: NOASSERTION · Stargazers: 5539 · Issues: 0

veScale

A PyTorch Native LLM Training Framework

Language: Python · License: Apache-2.0 · Stargazers: 467 · Issues: 0

MetaCLIP

ICLR2024 Spotlight: curation/training code, metadata, distribution and pre-trained models for MetaCLIP; CVPR 2024: MoDE: CLIP Data Experts via Clustering

Language: Python · License: NOASSERTION · Stargazers: 1094 · Issues: 0

triton

Development repository for the Triton language and compiler

Language: C++ · License: MIT · Stargazers: 11773 · Issues: 0
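
As a quick illustration of what the Triton language looks like, here is a minimal vector-add kernel sketch written with the `@triton.jit` decorator and `triton.language` primitives; the names `add_kernel` and `add` are illustrative, not taken from the repository.

```python
# Minimal Triton vector-add sketch; kernel/wrapper names are illustrative.
import torch
import triton
import triton.language as tl

@triton.jit
def add_kernel(x_ptr, y_ptr, out_ptr, n_elements, BLOCK_SIZE: tl.constexpr):
    pid = tl.program_id(axis=0)                          # one program instance per block
    offsets = pid * BLOCK_SIZE + tl.arange(0, BLOCK_SIZE)
    mask = offsets < n_elements                          # guard out-of-bounds lanes
    x = tl.load(x_ptr + offsets, mask=mask)
    y = tl.load(y_ptr + offsets, mask=mask)
    tl.store(out_ptr + offsets, x + y, mask=mask)

def add(x: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
    out = torch.empty_like(x)
    n = out.numel()
    grid = lambda meta: (triton.cdiv(n, meta["BLOCK_SIZE"]),)
    add_kernel[grid](x, y, out, n, BLOCK_SIZE=1024)
    return out

x = torch.rand(4096, device="cuda")
y = torch.rand(4096, device="cuda")
assert torch.allclose(add(x, y), x + y)
```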

LLMs_interview_notes

This repository mainly records interview questions for large language model (LLM) algorithm engineer roles.

License: Apache-2.0 · Stargazers: 1111 · Issues: 0

TensorRT-LLM

TensorRT-LLM provides users with an easy-to-use Python API to define Large Language Models (LLMs) and build TensorRT engines that contain state-of-the-art optimizations to perform inference efficiently on NVIDIA GPUs. TensorRT-LLM also contains components to create Python and C++ runtimes that execute those TensorRT engines.

Language: C++ · License: Apache-2.0 · Stargazers: 7266 · Issues: 0
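
A hedged sketch of the high-level Python `LLM` API that recent TensorRT-LLM releases expose; whether this entry point is available depends on the installed version, and the model id below is an assumption for illustration only.

```python
# Hedged sketch of TensorRT-LLM's high-level LLM API (recent releases);
# the model id is an assumption for illustration only.
from tensorrt_llm import LLM, SamplingParams

llm = LLM(model="TinyLlama/TinyLlama-1.1B-Chat-v1.0")   # builds/loads a TensorRT engine under the hood
params = SamplingParams(temperature=0.8)

outputs = llm.generate(["What does TensorRT-LLM optimize?"], params)
print(outputs[0].outputs[0].text)
```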

flash-attention

Fast and memory-efficient exact attention

Language: Python · License: BSD-3-Clause · Stargazers: 11671 · Issues: 0
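
A minimal sketch of calling the repository's `flash_attn_func` kernel; it expects fp16/bf16 CUDA tensors shaped `(batch, seqlen, nheads, headdim)`.

```python
# Minimal flash_attn_func call; requires fp16/bf16 tensors on a CUDA device.
import torch
from flash_attn import flash_attn_func

batch, seqlen, nheads, headdim = 2, 1024, 8, 64
q = torch.randn(batch, seqlen, nheads, headdim, dtype=torch.float16, device="cuda")
k = torch.randn_like(q)
v = torch.randn_like(q)

# Exact attention, computed without materializing the full seqlen x seqlen score matrix.
out = flash_attn_func(q, k, v, causal=True)
print(out.shape)  # torch.Size([2, 1024, 8, 64])
```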

llm-engine

Scale LLM Engine public repository

Language: Python · License: Apache-2.0 · Stargazers: 752 · Issues: 0

DeepSeek-LLM

DeepSeek LLM: Let there be answers

Language: Makefile · License: MIT · Stargazers: 1307 · Issues: 0
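
The released DeepSeek LLM weights are published on Hugging Face, so a hedged way to try them is the standard `transformers` API; the model id below is an assumption based on the project's published checkpoints.

```python
# Hedged sketch: loading a DeepSeek LLM checkpoint with Hugging Face transformers.
# The model id is an assumption based on the project's published weights.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "deepseek-ai/deepseek-llm-7b-base"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

inputs = tokenizer("The capital of France is", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```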

FasterTransformer

Transformer related optimization, including BERT, GPT

Language: C++ · License: Apache-2.0 · Stargazers: 5609 · Issues: 0

NeMo

A scalable generative AI framework built for researchers and developers working on Large Language Models, Multimodal, and Speech AI (Automatic Speech Recognition and Text-to-Speech)

Language: Python · License: Apache-2.0 · Stargazers: 10592 · Issues: 0
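
A hedged sketch of NeMo's pretrained-model workflow using its ASR collection; the model name and audio path are assumptions for illustration.

```python
# Hedged sketch of NeMo's from_pretrained workflow for ASR;
# the model name and audio path are assumptions for illustration.
import nemo.collections.asr as nemo_asr

asr_model = nemo_asr.models.EncDecCTCModel.from_pretrained(model_name="QuartzNet15x5Base-En")
transcripts = asr_model.transcribe(["sample.wav"])   # list of audio files in, transcripts out
print(transcripts[0])
```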

LLaMA-Factory

Unify Efficient Fine-Tuning of 100+ LLMs

Language: Python · License: Apache-2.0 · Stargazers: 24805 · Issues: 0

FlexGen

Running large language models on a single GPU for throughput-oriented scenarios.

Language: Python · License: Apache-2.0 · Stargazers: 9062 · Issues: 0

FastChat

An open platform for training, serving, and evaluating large language models. Release repo for Vicuna and Chatbot Arena.

Language: Python · License: Apache-2.0 · Stargazers: 35395 · Issues: 0

deepsparse

Sparsity-aware deep learning inference runtime for CPUs

Language: Python · License: NOASSERTION · Stargazers: 2920 · Issues: 0
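
A hedged sketch of the `deepsparse.Pipeline` entry point; the task name and model path are assumptions for illustration (deployments normally point at an exported ONNX model or a SparseZoo stub).

```python
# Hedged sketch of the deepsparse Pipeline API; the task name and model path
# are assumptions for illustration.
from deepsparse import Pipeline

pipeline = Pipeline.create(
    task="text-classification",
    model_path="./deployment",   # hypothetical path to an exported ONNX model directory
)
print(pipeline("Sparse inference on CPUs can be surprisingly fast."))
```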

llm-hallucination-survey

Reading list of hallucination in LLMs. Check out our new survey paper: "Siren’s Song in the AI Ocean: A Survey on Hallucination in Large Language Models"

Stargazers: 845 · Issues: 0

FlagAI

FlagAI (Fast LArge-scale General AI models) is a fast, easy-to-use and extensible toolkit for large-scale models.

Language: Python · License: Apache-2.0 · Stargazers: 3795 · Issues: 0

DeepSpeed

DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.

Language: Python · License: Apache-2.0 · Stargazers: 33504 · Issues: 0
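
A minimal sketch of wrapping a PyTorch model with `deepspeed.initialize`; the toy model and config values are placeholders, and in practice the script is launched with the `deepspeed` launcher so distributed state is set up.

```python
# Minimal deepspeed.initialize sketch; the toy model and config values are placeholders.
import torch
import deepspeed

model = torch.nn.Linear(512, 512)
ds_config = {
    "train_micro_batch_size_per_gpu": 8,
    "optimizer": {"type": "Adam", "params": {"lr": 1e-4}},
    "zero_optimization": {"stage": 2},   # partition optimizer state and gradients across ranks
}

model_engine, optimizer, _, _ = deepspeed.initialize(
    model=model, model_parameters=model.parameters(), config=ds_config
)

x = torch.randn(8, 512).to(model_engine.device)
loss = model_engine(x).pow(2).mean()
model_engine.backward(loss)   # handles gradient accumulation and synchronization
model_engine.step()
```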

text-generation-inference

Large Language Model Text Generation Inference

Language: Python · License: Apache-2.0 · Stargazers: 8278 · Issues: 0
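
A hedged sketch of querying a running text-generation-inference server from Python via its `/generate` HTTP endpoint; the host and port are assumptions (the server itself is usually started separately, e.g. from the project's Docker image).

```python
# Hedged client call against a running TGI server; host/port are assumptions.
import requests

resp = requests.post(
    "http://localhost:8080/generate",
    json={
        "inputs": "What does text-generation-inference do?",
        "parameters": {"max_new_tokens": 64, "temperature": 0.7},
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["generated_text"])
```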

vllm

A high-throughput and memory-efficient inference and serving engine for LLMs

Language: Python · License: Apache-2.0 · Stargazers: 21497 · Issues: 0
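
A minimal sketch of vLLM's offline batched-inference API; the model id is an assumption for illustration.

```python
# Minimal vLLM offline inference sketch; the model id is an assumption.
from vllm import LLM, SamplingParams

llm = LLM(model="facebook/opt-125m")                    # loads weights and preallocates the paged KV cache
params = SamplingParams(temperature=0.8, top_p=0.95, max_tokens=64)

outputs = llm.generate(["The future of LLM serving is"], params)
for out in outputs:
    print(out.outputs[0].text)
```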

beringei

Beringei is a high performance, in-memory storage engine for time series data.

Language: C++ · License: NOASSERTION · Stargazers: 3166 · Issues: 0

chatbox

User-friendly Desktop Client App for AI Models/LLMs (GPT, Claude, Gemini, Ollama...)

Language: TypeScript · License: GPL-3.0 · Stargazers: 19543 · Issues: 0

risc0

RISC Zero is a zero-knowledge verifiable general computing platform based on zk-STARKs and the RISC-V microarchitecture.

Language: C++ · License: Apache-2.0 · Stargazers: 1533 · Issues: 0

awesome-quant

A curated list of insanely awesome libraries, packages and resources for Quants (Quantitative Finance)

Language: Python · Stargazers: 16677 · Issues: 0

lora

Using Low-rank adaptation to quickly fine-tune diffusion models.

Language: Jupyter Notebook · License: Apache-2.0 · Stargazers: 6735 · Issues: 0
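
As a generic illustration of the low-rank adaptation idea this repository applies to diffusion models (not its actual API), a frozen linear layer can be augmented with a small trainable update `B @ A` so that only a handful of parameters are fine-tuned.

```python
# Generic LoRA illustration (not this repository's API): a frozen weight W
# gains a trainable low-rank update B @ A of rank r.
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    def __init__(self, base: nn.Linear, r: int = 4, alpha: float = 1.0):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad_(False)                      # freeze the pretrained layer
        self.A = nn.Parameter(torch.randn(r, base.in_features) * 0.01)
        self.B = nn.Parameter(torch.zeros(base.out_features, r))  # zero init: no change at start
        self.scale = alpha / r

    def forward(self, x):
        return self.base(x) + (x @ self.A.t() @ self.B.t()) * self.scale

layer = LoRALinear(nn.Linear(768, 768), r=4)
trainable = sum(p.numel() for p in layer.parameters() if p.requires_grad)
print(trainable)  # only A and B are trainable: 4*768 + 768*4 = 6144
```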