Nonu's starred repositories
edgetunnel
Modified from the original to display VLESS configuration information converted into subscription content. With this script, you can easily use an online config converter to import VLESS configuration information into tools such as Clash or Singbox.
PowerInfer
High-speed Large Language Model Serving on PCs with Consumer-grade GPUs
direct-preference-optimization
Reference implementation for DPO (Direct Preference Optimization)
RWKV-LM
RWKV is an RNN with transformer-level LLM performance. It can be trained directly like a GPT (parallelizable), combining the best of RNNs and transformers: great performance, fast inference, low VRAM usage, fast training, "infinite" ctx_len, and free sentence embeddings.
RedPajama-Data
The RedPajama-Data repository contains code for preparing large datasets for training large language models.
self-instruct
Aligning pretrained language models with instruction data generated by the models themselves.
stanford_alpaca
Code and documentation to train Stanford's Alpaca models and generate the data.
Learning-Prompt
A free online prompt-engineering course. ChatGPT and Midjourney tutorials are now included!
MochiDiffusion
Run Stable Diffusion on Mac natively
llama_index
LlamaIndex is a data framework for your LLM applications
openai-cookbook
Examples and guides for using the OpenAI API
alpaca.cpp
Locally run an Instruction-Tuned Chat-Style LLM