Liangyu Chen (cliangyu)

Company: Nanyang Technological University

Location: Singapore

Home Page: cliangyu.com

Twitter: @cliangyu_

Liangyu Chen's repositories

Cola

[NeurIPS2023] Official implementation of the paper "Large Language Models are Visual Reasoning Coordinators"

Language: Jupyter Notebook | License: NOASSERTION | Stargazers: 94 | Issues: 3 | Issues: 1

CSVAL

[MIDL 2023] Official implementation of "Making Your First Choice: To Address Cold Start Problem in Vision Active Learning"

Language: Python | License: MIT | Stargazers: 31 | Issues: 3 | Issues: 1

random_hacks

Random hacks that I need to keep happy

License: MIT | Stargazers: 3 | Issues: 1 | Issues: 0

transformers

🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.

Language: Python | License: Apache-2.0 | Stargazers: 0 | Issues: 0 | Issues: 0
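
As a quick orientation to what this fork tracks, here is a minimal, hedged sketch of the library's pipeline API (PyTorch backend); the default sentiment model downloads on first use, and the input sentence is only illustrative.

from transformers import pipeline

# Build a ready-to-use sentiment-analysis pipeline; a default pretrained
# model and tokenizer are downloaded on first use.
classifier = pipeline("sentiment-analysis")

# Returns a list of {"label": ..., "score": ...} dicts, one per input string.
print(classifier("Transformers makes state-of-the-art models easy to use."))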

DeepSpeed

DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.

Language: Python | License: Apache-2.0 | Stargazers: 0 | Issues: 0 | Issues: 0
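
To illustrate what "making distributed training easy" means in practice, here is a minimal sketch, assuming a recent DeepSpeed release: deepspeed.initialize() wraps an ordinary PyTorch model according to a JSON-style config. The toy model, optimizer settings, and ZeRO stage below are assumptions, and real jobs are normally launched with the deepspeed CLI.

import torch
import deepspeed

# A toy PyTorch model standing in for a real network.
model = torch.nn.Linear(512, 10)

# Illustrative config: batch size, optimizer, and ZeRO stage are assumptions.
ds_config = {
    "train_batch_size": 8,
    "optimizer": {"type": "Adam", "params": {"lr": 1e-4}},
    "zero_optimization": {"stage": 1},
}

# initialize() returns an engine that handles optimizer steps, gradient
# accumulation, and (under the deepspeed launcher) data parallelism.
engine, optimizer, _, _ = deepspeed.initialize(
    model=model, model_parameters=model.parameters(), config=ds_config
)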

Emu

Emu: An Open Multimodal Generalist

Language: Python | Stargazers: 0 | Issues: 0 | Issues: 0

fast-stable-diffusion

fast-stable-diffusion: +25-50% speed increase, memory efficient, with DreamBooth support

Language: Python | License: MIT | Stargazers: 0 | Issues: 0 | Issues: 0

gpt-neox

An implementation of model parallel autoregressive transformers on GPUs, based on the DeepSpeed library.

Language: Python | License: Apache-2.0 | Stargazers: 0 | Issues: 0 | Issues: 0

litellm

Call all LLM APIs using the OpenAI format. Use Azure, OpenAI, Cohere, Anthropic, Ollama, vLLM, SageMaker, Hugging Face, Replicate (100+ LLMs)

License: MIT | Stargazers: 0 | Issues: 0 | Issues: 0
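
As a hedged sketch of the "OpenAI format for every provider" idea: the same completion() call targets different backends by changing the model string, with API keys read from the usual environment variables. The model names below are illustrative.

from litellm import completion

messages = [{"role": "user", "content": "Say hello in one word."}]

# Same call shape for different providers; only the model string changes.
# Keys come from environment variables such as OPENAI_API_KEY / ANTHROPIC_API_KEY.
openai_resp = completion(model="gpt-3.5-turbo", messages=messages)
claude_resp = completion(model="anthropic/claude-3-haiku-20240307", messages=messages)

# Responses follow the OpenAI schema regardless of backend.
print(openai_resp.choices[0].message.content)
print(claude_resp.choices[0].message.content)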

llama-recipes

Scripts for fine-tuning Meta Llama 3 with composable FSDP & PEFT methods, covering single- and multi-node GPUs. Supports default & custom datasets for applications such as summarization and Q&A, and a number of inference solutions such as HF TGI and vLLM for local or cloud deployment. Includes demo apps showcasing Meta Llama 3 for WhatsApp & Messenger.

Stargazers: 0 | Issues: 0 | Issues: 0

llama3

The main Llama 3 GitHub site; will be moved under Meta-Llama.

License: NOASSERTION | Stargazers: 0 | Issues: 0 | Issues: 0

LLaVA

Visual Instruction Tuning: Large Language-and-Vision Assistant built towards multimodal GPT-4 level capabilities.

Language: Python | License: Apache-2.0 | Stargazers: 0 | Issues: 0 | Issues: 0

LLMSpeculativeSampling

Fast inference from large language models via speculative decoding

Stargazers: 0 | Issues: 0 | Issues: 0
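
Since the description only names the technique, here is a toy, self-contained sketch of speculative sampling (not this repository's code): a cheap draft distribution proposes several tokens, the expensive target distribution verifies them, each proposal is accepted with probability min(1, p_target / p_draft), and the first rejection is resampled from the residual target mass. Random categorical distributions stand in for real model outputs.

import numpy as np

rng = np.random.default_rng(0)
VOCAB = 8  # toy vocabulary size

def draft_dist(_prefix):
    # Stand-in for the small, fast draft model.
    p = rng.random(VOCAB)
    return p / p.sum()

def target_dist(_prefix):
    # Stand-in for the large, slow target model.
    p = rng.random(VOCAB)
    return p / p.sum()

def speculative_step(prefix, k=4):
    """Propose k draft tokens, then accept/reject them against the target."""
    proposals, draft_probs, ctx = [], [], list(prefix)
    for _ in range(k):
        q = draft_dist(ctx)
        tok = int(rng.choice(VOCAB, p=q))
        proposals.append(tok)
        draft_probs.append(q)
        ctx.append(tok)

    accepted = []
    for tok, q in zip(proposals, draft_probs):
        p = target_dist(list(prefix) + accepted)
        if rng.random() < min(1.0, p[tok] / q[tok]):
            accepted.append(tok)                 # target agrees: keep the draft token
        else:
            residual = np.maximum(p - q, 0.0)    # resample from the leftover target mass
            residual /= residual.sum()
            accepted.append(int(rng.choice(VOCAB, p=residual)))
            break                                # stop at the first rejection
    return accepted

print(speculative_step([1, 2, 3]))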

Megatron-DeepSpeed

Ongoing research training transformer language models at scale, including: BERT & GPT-2

Language: Python | License: NOASSERTION | Stargazers: 0 | Issues: 0 | Issues: 0

OFA

Official repository of OFA (ICML 2022). Paper: OFA: Unifying Architectures, Tasks, and Modalities Through a Simple Sequence-to-Sequence Learning Framework

Language: Python | License: Apache-2.0 | Stargazers: 0 | Issues: 0 | Issues: 0

open_flamingo

An open-source framework for training large multimodal models.

Language: Python | License: MIT | Stargazers: 0 | Issues: 0 | Issues: 0

optimum

🚀 Accelerate training and inference of 🤗 Transformers and 🤗 Diffusers with easy-to-use hardware optimization tools

Language: Python | License: Apache-2.0 | Stargazers: 0 | Issues: 0 | Issues: 0

visitor-badge

A badge generator service that counts visitors to your markdown file.

Language: HTML | License: GPL-3.0 | Stargazers: 0 | Issues: 0 | Issues: 0

visual-chatgpt

VisualChatGPT

Language: Python | License: MIT | Stargazers: 0 | Issues: 0 | Issues: 0

yang-song.github.io

Personal website

Language: JavaScript | License: MIT | Stargazers: 0 | Issues: 0 | Issues: 0