Kun-Lin Lee's starred repositories
DeepSpeedExamples
Example models using DeepSpeed
Semi-supervised-learning
A Unified Semi-Supervised Learning Codebase (NeurIPS'22)
awesome-semi-supervised-learning
An up-to-date & curated list of awesome semi-supervised learning papers, methods & resources.
awesome-competitive-programming
:gem: A curated list of awesome Competitive Programming, Algorithm and Data Structure resources
cp-algorithms
Algorithm and data structure articles for https://cp-algorithms.com (based on http://e-maxx.ru)
nn-zero-to-hero
Neural Networks: Zero to Hero
LocalAI
:robot: The free, open-source OpenAI alternative. Self-hosted, community-driven, and local-first. A drop-in replacement for OpenAI that runs on consumer-grade hardware. No GPU required. Runs gguf, transformers, diffusers, and many more model architectures. It can generate text, audio, video, and images, and also supports voice cloning.
open-interpreter
A natural language interface for computers
evolutionary-model-merge
Official repository of Evolutionary Optimization of Model Merging Recipes
LLaMA-Factory
A WebUI for Efficient Fine-Tuning of 100+ LLMs (ACL 2024)
LLMs-from-scratch
Implementing a ChatGPT-like LLM in PyTorch from scratch, step by step
Awesome-LLM-RAG
Awesome-LLM-RAG: a curated list of advanced retrieval augmented generation (RAG) in Large Language Models
gemma_pytorch
The official PyTorch implementation of Google's Gemma models
RWKV-LM
RWKV is an RNN with transformer-level LLM performance. It can be trained directly like a GPT (parallelizable), so it combines the best of RNNs and transformers: great performance, fast inference, VRAM savings, fast training, "infinite" ctx_len, and free sentence embeddings.
recurrent-memory-transformer-pytorch
Implementation of Recurrent Memory Transformer (NeurIPS 2022 paper) in PyTorch
block-recurrent-transformer-pytorch
Implementation of Block Recurrent Transformer in PyTorch