Amy Tzu-Yu Chen's starred repositories
the-algorithm
Source code for Twitter's Recommendation Algorithm
open-interpreter
A natural language interface for computers
alpaca-lora
Instruct-tune LLaMA on consumer hardware
RWKV-LM
RWKV is an RNN with transformer-level LLM performance. It can be trained directly like a GPT (parallelizable), combining the best of RNNs and transformers: great performance, fast inference, low VRAM use, fast training, "infinite" ctx_len, and free sentence embeddings.
Chinese-Vicuna
A Chinese instruction-following LLaMA-based model — a low-resource Chinese LLaMA + LoRA approach, with a structure modeled on Alpaca
llm-numbers
Numbers every LLM developer should know
Baichuan-13B
A 13B large language model developed by Baichuan Intelligent Technology
LLaMA2-Accessory
An Open-source Toolkit for LLM Development
insanely-fast-whisper
Incredibly fast Whisper-large-v3
Taiwan-LLM
Traditional Mandarin LLMs for Taiwan
filetype.py
Small, dependency-free, fast Python package to infer binary file types by checking magic number signatures