awesome-LLM-resourses

🧑‍🚀 The world's best collection of Chinese LLM resources


The world's best collection of Chinese large language model resources, continuously updated.

数据 Data

  1. AutoLabel: Label, clean and enrich text datasets with LLMs.
  2. LabelLLM: The Open-Source Data Annotation Platform.
  3. data-juicer: A one-stop data processing system to make data higher-quality, juicier, and more digestible for LLMs!
  4. OmniParser: a native Golang ETL streaming parser and transform library for CSV, JSON, XML, EDI, text, etc.
  5. MinerU: MinerU is a one-stop, open-source, high-quality data extraction tool, supports PDF/webpage/e-book extraction.
  6. PDF-Extract-Kit: A Comprehensive Toolkit for High-Quality PDF Content Extraction.
  7. Parsera: Lightweight library for scraping websites with LLMs.

微调 Fine-Tuning

  1. LLaMA-Factory: Unify Efficient Fine-Tuning of 100+ LLMs.
  2. unsloth: 2-5x faster LLM fine-tuning with 80% less memory.
  3. TRL: Transformer Reinforcement Learning.
  4. Firefly: A training toolkit for large models, supporting dozens of LLMs.
  5. Xtuner: An efficient, flexible and full-featured toolkit for fine-tuning large models.
  6. torchtune: A Native-PyTorch Library for LLM Fine-tuning.
  7. Swift: Use PEFT or Full-parameter to finetune 200+ LLMs or 15+ MLLMs.
  8. AutoTrain: A new way to automatically train, evaluate and deploy state-of-the-art Machine Learning models.
  9. OpenRLHF: An Easy-to-use, Scalable and High-performance RLHF Framework (Support 70B+ full tuning & LoRA & Mixtral & KTO).
  10. Ludwig: Low-code framework for building custom LLMs, neural networks, and other AI models.
  11. mistral-finetune: A light-weight codebase that enables memory-efficient and performant finetuning of Mistral's models.
  12. aikit: Fine-tune, build, and deploy open-source LLMs easily!
  13. H2O-LLMStudio: H2O LLM Studio - a framework and no-code GUI for fine-tuning LLMs.
  14. LitGPT: Pretrain, finetune, deploy 20+ LLMs on your own data. Uses state-of-the-art techniques: flash attention, FSDP, 4-bit, LoRA, and more.
  15. LLMBox: A comprehensive library for implementing LLMs, including a unified training pipeline and comprehensive model evaluation.
  16. PaddleNLP: Easy-to-use and powerful NLP and LLM library.
  17. workbench-llamafactory: This is an NVIDIA AI Workbench example project that demonstrates an end-to-end model development workflow using Llamafactory.
  18. TinyLLaVA Factory: A framework of small-scale large multimodal models.
  19. LLM-Foundry: LLM training code for Databricks foundation models.
  20. lmms-finetune: A unified codebase for finetuning (full, LoRA) large multimodal models, supporting llava-1.5, qwen-vl, llava-interleave, llava-next-video, phi3-v, etc.
  21. Simplifine: Simplifine lets you invoke LLM finetuning with just one line of code using any Hugging Face dataset or model.
  22. Transformer Lab: Open-source application for advanced LLM engineering: interact, train, fine-tune, and evaluate large language models on your own computer.
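
Most of the toolkits above (LLaMA-Factory, unsloth, Swift, and other PEFT-based trainers) support LoRA-style adapters. As a toy illustration of the underlying idea, not any specific library's API, the sketch below applies a low-rank update W_eff = W + (alpha/r)·BA to a frozen weight matrix; all sizes are made up.

```python
# Toy LoRA (Low-Rank Adaptation) sketch: instead of updating the full weight
# matrix W (d x k), train two small matrices B (d x r) and A (r x k) with
# r << min(d, k), and use W_eff = W + (alpha / r) * B @ A.
import numpy as np

rng = np.random.default_rng(0)
d, k, r, alpha = 8, 8, 2, 4          # hypothetical sizes; real layers are thousands wide

W = rng.normal(size=(d, k))          # frozen pretrained weight
A = rng.normal(size=(r, k)) * 0.01   # trainable, small random init
B = np.zeros((d, r))                 # trainable, zero init => adapter starts as a no-op

def lora_forward(x, W, A, B, alpha, r):
    """y = x @ W_eff.T with W_eff = W + (alpha / r) * B @ A."""
    return x @ (W + (alpha / r) * B @ A).T

x = rng.normal(size=(1, k))
# With B zero-initialized, the adapted layer matches the frozen layer exactly.
assert np.allclose(lora_forward(x, W, A, B, alpha, r), x @ W.T)

# Only the adapter's 2*d*r parameters are trained instead of all d*k.
print(A.size + B.size, "adapter params vs", W.size, "full params")  # 32 vs 64
```

At realistic sizes (d, k in the thousands, r around 8-64) the parameter saving is several orders of magnitude, which is why LoRA is the default in most of the tools listed here.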

推理 Inference

  1. ollama: Get up and running with Llama 3, Mistral, Gemma, and other large language models.
  2. Open WebUI: User-friendly WebUI for LLMs (Formerly Ollama WebUI).
  3. Text Generation WebUI: A Gradio web UI for Large Language Models. Supports transformers, GPTQ, AWQ, EXL2, llama.cpp (GGUF), Llama models.
  4. Xinference: A powerful and versatile library designed to serve language, speech recognition, and multimodal models.
  5. LangChain: Build context-aware reasoning applications.
  6. LlamaIndex: A data framework for your LLM applications.
  7. lobe-chat: An open-source, modern-design LLM/AI chat framework. Supports multiple AI providers, multi-modality (vision/TTS), and a plugin system.
  8. TensorRT-LLM: TensorRT-LLM provides users with an easy-to-use Python API to define Large Language Models (LLMs) and build TensorRT engines that contain state-of-the-art optimizations to perform inference efficiently on NVIDIA GPUs.
  9. vllm: A high-throughput and memory-efficient inference and serving engine for LLMs.
  10. LlamaChat: Chat with your favourite LLaMA models in a native macOS app.
  11. NVIDIA ChatRTX: ChatRTX is a demo app that lets you personalize a GPT large language model (LLM) connected to your own content—docs, notes, or other data.
  12. LM Studio: Discover, download, and run local LLMs.
  13. chat-with-mlx: Chat with your data natively on Apple Silicon using MLX Framework.
  14. LLM Pricing: Quickly Find the Perfect Large Language Models (LLM) API for Your Budget! Use Our Free Tool for Instant Access to the Latest Prices from Top Providers.
  15. Open Interpreter: A natural language interface for computers.
  16. Chat-ollama: An open source chatbot based on LLMs. It supports a wide range of language models, and knowledge base management.
  17. chat-ui: Open source codebase powering the HuggingChat app.
  18. MemGPT: Create LLM agents with long-term memory and custom tools.
  19. koboldcpp: A simple one-file way to run various GGML and GGUF models with KoboldAI's UI.
  20. LLMFarm: Run LLaMA and other large language models offline on iOS and macOS using the GGML library.
  21. enchanted: An iOS and macOS app for chatting with private, self-hosted language models such as Llama 2, Mistral, or Vicuna via Ollama.
  22. Flowise: Drag & drop UI to build your customized LLM flow.
  23. Jan: Jan is an open source alternative to ChatGPT that runs 100% offline on your computer. Multiple engine support (llama.cpp, TensorRT-LLM).
  24. LMDeploy: LMDeploy is a toolkit for compressing, deploying, and serving LLMs.
  25. RouteLLM: A framework for serving and evaluating LLM routers - save LLM costs without compromising quality!
  26. MInference: Speeds up long-context LLM inference with approximate and dynamic sparse attention, reducing pre-filling latency by up to 10x on an A100 while maintaining accuracy.
  27. Mem0: The memory layer for Personalized AI.
  28. SGLang: SGLang is yet another fast serving framework for large language models and vision language models.
  29. AirLLM: Optimizes inference memory usage, allowing 70B models to run on a single 4GB GPU without quantization, distillation, or pruning; 405B Llama 3.1 can now run on 8GB of VRAM.
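
At their core, all of the serving engines above run a token-by-token decoding loop; the batching, KV caching, and paged-attention tricks of vllm or TensorRT-LLM exist to make that loop fast. A minimal greedy-decoding sketch, with a hypothetical bigram table standing in for a real model forward pass:

```python
# Toy greedy decoding loop. A real engine would call the model for next-token
# logits at each step; here a hand-written bigram table plays that role.
BIGRAM_LOGITS = {  # hypothetical next-token scores
    "<s>": {"the": 2.0, "a": 1.0},
    "the": {"cat": 3.0, "dog": 1.5},
    "cat": {"sat": 2.5, "</s>": 1.0},
    "sat": {"</s>": 4.0},
}

def greedy_decode(start="<s>", max_tokens=10):
    tokens = [start]
    for _ in range(max_tokens):
        scores = BIGRAM_LOGITS.get(tokens[-1])
        if not scores:
            break
        nxt = max(scores, key=scores.get)   # greedy: pick the argmax token
        tokens.append(nxt)
        if nxt == "</s>":                   # stop at end-of-sequence
            break
    return tokens

print(greedy_decode())  # ['<s>', 'the', 'cat', 'sat', '</s>']
```

Swapping the argmax for sampling over the score distribution gives temperature/top-p decoding; everything else about the loop stays the same.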

评估 Evaluation

  1. lm-evaluation-harness: A framework for few-shot evaluation of language models.
  2. opencompass: OpenCompass is an LLM evaluation platform supporting a wide range of models (Llama3, Mistral, InternLM2, GPT-4, LLaMA2, Qwen, GLM, Claude, etc.) over 100+ datasets.
  3. llm-comparator: LLM Comparator is an interactive data visualization tool for evaluating and analyzing LLM responses side-by-side.
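
Harnesses like lm-evaluation-harness typically score multiple-choice tasks by comparing the (often length-normalized) log-likelihood of each candidate answer and picking the argmax. A toy sketch of that protocol, with a made-up logprob function standing in for a real model call:

```python
# Toy multiple-choice evaluation via length-normalized log-likelihood.
import math

def fake_token_logprob(token):
    # Hypothetical stand-in for a model's per-token log-probability;
    # shorter tokens score higher here, purely for illustration.
    return -math.log(1 + len(token))

def choice_score(continuation, normalize=True):
    tokens = continuation.split()
    total = sum(fake_token_logprob(t) for t in tokens)
    return total / len(tokens) if normalize else total

def pick_answer(choices):
    # The predicted answer is the choice with the highest score.
    return max(choices, key=choice_score)

choices = ["the sky is blue", "colorless green ideas sleep furiously"]
best = pick_answer(choices)
print(best)  # the sky is blue
```

Length normalization matters: without it, longer answers are systematically penalized, which is why harnesses usually report both raw and normalized accuracy.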

体验 Usage

  1. LMSYS Chatbot Arena: Benchmarking LLMs in the Wild
  2. CompassArena (司南大模型竞技场): an LLM arena
  3. 琅琊榜 (LLM leaderboard)
  4. Huggingface Spaces
  5. WiseModel Spaces
  6. Poe
  7. 林哥的大模型野榜 (an unofficial LLM leaderboard)

RAG

  1. AnythingLLM: The all-in-one AI app for any LLM with full RAG and AI Agent capabilities.
  2. MaxKB: A knowledge-base question-answering system built on large language models. Works out of the box and supports rapid embedding into third-party business systems.
  3. RAGFlow: An open-source RAG (Retrieval-Augmented Generation) engine based on deep document understanding.
  4. Dify: An open-source LLM app development platform. Dify's intuitive interface combines AI workflow, RAG pipeline, agent capabilities, model management, observability features and more, letting you quickly go from prototype to production.
  5. FastGPT: A knowledge-based platform built on the LLM, offers out-of-the-box data processing and model invocation capabilities, allows for workflow orchestration through Flow visualization.
  6. Langchain-Chatchat: Local knowledge-base Q&A built on Langchain and LLMs such as ChatGLM.
  7. QAnything: Question and Answer based on Anything.
  8. Quivr: A personal productivity assistant (RAG) ⚡️🤖 that lets you chat with your docs (PDF, CSV, ...) and apps using Langchain, GPT-3.5/4-turbo, Anthropic, VertexAI, Ollama, Groq, and other LLMs, and share it with users. A local and private alternative to OpenAI GPTs and ChatGPT, powered by retrieval-augmented generation.
  9. RAG-GPT: RAG-GPT, leveraging LLM and RAG technology, learns from user-customized knowledge bases to provide contextually relevant answers for a wide range of queries, ensuring rapid and accurate information retrieval.
  10. Verba: Retrieval Augmented Generation (RAG) chatbot powered by Weaviate.
  11. FlashRAG: A Python Toolkit for Efficient RAG Research.
  12. GraphRAG: A modular graph-based Retrieval-Augmented Generation (RAG) system.
  13. LightRAG: LightRAG helps developers with both building and optimizing Retriever-Agent-Generator pipelines.
  14. GraphRAG-Ollama-UI: GraphRAG using Ollama with Gradio UI and Extra Features.
  15. nano-GraphRAG: A simple, easy-to-hack GraphRAG implementation.
  16. RAG Techniques: This repository showcases various advanced techniques for Retrieval-Augmented Generation (RAG) systems. RAG systems combine information retrieval with generative models to provide accurate and contextually rich responses.
  17. ragas: Evaluation framework for your Retrieval Augmented Generation (RAG) pipelines.
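
All of the RAG systems above share the same retrieve-then-generate skeleton: embed the corpus and the query, fetch the most similar chunks, and prepend them to the prompt. A dependency-free toy sketch, using bag-of-words vectors in place of a neural embedder and omitting the final generator call:

```python
# Toy retrieval step of a RAG pipeline: cosine similarity over word counts.
import math
from collections import Counter

def embed(text):
    return Counter(text.lower().split())          # toy bag-of-words "embedding"

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, docs, k=1):
    q = embed(query)
    return sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]

docs = [
    "LoRA adapts large models with low-rank updates.",
    "The capital of France is Paris.",
]
question = "what is the capital of France"
hits = retrieve(question, docs)
# The retrieved chunks are stuffed into the prompt sent to the generator.
prompt = "Context:\n" + "\n".join(hits) + "\n\nQuestion: " + question
print(hits[0])
```

Production systems replace `embed` with a neural embedding model and the sorted scan with an approximate-nearest-neighbor vector store, but the data flow is the same.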

Agents

  1. AutoGen: AutoGen is a framework that enables the development of LLM applications using multiple agents that can converse with each other to solve tasks. AutoGen AIStudio
  2. CrewAI: Framework for orchestrating role-playing, autonomous AI agents. By fostering collaborative intelligence, CrewAI empowers agents to work together seamlessly, tackling complex tasks.
  3. Coze
  4. AgentGPT: Assemble, configure, and deploy autonomous AI Agents in your browser.
  5. XAgent: An Autonomous LLM Agent for Complex Task Solving.
  6. MobileAgent: The Powerful Mobile Device Operation Assistant Family.
  7. Lagent: A lightweight framework for building LLM-based agents.
  8. Qwen-Agent: Agent framework and applications built upon Qwen2, featuring Function Calling, Code Interpreter, RAG, and Chrome extension.
  9. LinkAI: One-stop platform for building AI agents.
  10. Baidu APPBuilder
  11. agentUniverse: agentUniverse is an LLM multi-agent framework that allows developers to easily build multi-agent applications and, through the community, exchange and share practice patterns across different domains.
  12. LazyLLM: A low-code development tool for building multi-agent LLM applications.
  13. AgentScope: Start building LLM-empowered multi-agent applications in an easier way.
  14. MoA: Mixture of Agents (MoA) is a novel approach that leverages the collective strengths of multiple LLMs to enhance performance, achieving state-of-the-art results.
  15. Agently: AI Agent Application Development Framework.
  16. OmAgent: A multimodal agent framework for solving complex tasks.
  17. Tribe: No code tool to rapidly build and coordinate multi-agent teams.
  18. CAMEL: Finding the Scaling Law of Agents. A multi-agent framework.
  19. PraisonAI: PraisonAI application combines AutoGen and CrewAI or similar frameworks into a low-code solution for building and managing multi-agent LLM systems, focusing on simplicity, customisation, and efficient human-agent collaboration.
  20. IoA: An open-source framework for collaborative AI agents, enabling diverse, distributed agents to team up and tackle complex tasks through internet-like connectivity.
  21. llama-agentic-system: Agentic components of the Llama Stack APIs.
  22. Agent Zero: Agent Zero is not a predefined agentic framework. It is designed to be dynamic, organically growing, and learning as you use it.
  23. Agents: An Open-source Framework for Data-centric, Self-evolving Autonomous Language Agents.
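
The frameworks above (AutoGen, CrewAI, Lagent, ...) all wrap a plan-act loop around a real LLM: the model emits an action, the runtime executes the matching tool, and the observation is fed back until a final answer appears. A toy sketch of that loop, with a scripted model as a hypothetical stand-in for actual LLM calls:

```python
# Toy agent loop: model proposes actions, runtime executes tools.
TOOLS = {
    # Toy calculator tool; never eval untrusted input in real code.
    "calculator": lambda expr: str(eval(expr, {"__builtins__": {}})),
}

SCRIPTED_MODEL = iter([
    "ACTION calculator 6*7",          # step 1: the "LLM" decides to use a tool
    "FINAL The answer is {obs}",      # step 2: it answers using the observation
])

def run_agent(max_steps=5):
    obs = ""
    for _ in range(max_steps):
        reply = next(SCRIPTED_MODEL).format(obs=obs)
        if reply.startswith("FINAL"):
            return reply.removeprefix("FINAL ").strip()
        _, tool, arg = reply.split(" ", 2)
        obs = TOOLS[tool](arg)        # execute the tool, feed the result back
    return "gave up"

result = run_agent()
print(result)  # The answer is 42
```

Real frameworks add structured function-calling schemas, multi-agent message routing, and memory, but this action-observation cycle is the common core.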

搜索 Search

  1. OpenSearch GPT: SearchGPT / Perplexity clone, but personalised for you.
  2. MindSearch: An LLM-based Multi-agent Framework of Web Search Engine (like Perplexity.ai Pro and SearchGPT).
  3. nanoPerplexityAI: The simplest open-source implementation of perplexity.ai.

书籍 Book

  1. 《大规模语言模型:从理论到实践》 (Large Language Models: From Theory to Practice)
  2. 《大语言模型》 (Large Language Models)
  3. 《动手学大模型 Dive into LLMs》
  4. 《动手做AI Agent》 (Hands-on AI Agents)
  5. 《Build a Large Language Model (From Scratch)》
  6. 《多模态大模型》 (Multimodal Large Models)
  7. 《Generative AI Handbook: A Roadmap for Learning Resources》
  8. 《Understanding Deep Learning》
  9. 《Illustrated book to learn about Transformers & LLMs》

课程 Course

  1. Stanford CS224N: Natural Language Processing with Deep Learning
  2. Andrew Ng: Generative AI for Everyone
  3. Andrew Ng: LLM series of courses
  4. ACL 2023 Tutorial: Retrieval-based Language Models and Applications
  5. llm-course: Course to get into Large Language Models (LLMs) with roadmaps and Colab notebooks.
  6. Microsoft: Generative AI for Beginners
  7. Microsoft: State of GPT
  8. HuggingFace NLP Course
  9. Tsinghua NLP (Liu Zhiyuan's group): open course on large models
  10. Stanford CS25: Transformers United V4
  11. Stanford CS324: Large Language Models
  12. Princeton COS 597G (Fall 2022): Understanding Large Language Models
  13. Johns Hopkins CS 601.471/671 NLP: Self-supervised Models
  14. Hung-yi Lee: GenAI course
  15. openai-cookbook: Examples and guides for using the OpenAI API.
  16. Hands on llms: Learn about LLMs, LLMOps, and vector DBs for free by designing, training, and deploying a real-time financial advisor LLM system.
  17. University of Waterloo CS 886: Recent Advances on Foundation Models
  18. Mistral: Getting Started with Mistral
  19. Coursera: Prompt engineering for ChatGPT applications
  20. LangGPT: Empowering everyone to become a prompt expert!
  21. mistralai-cookbook
  22. Introduction to Generative AI 2024 Spring
  23. build nanoGPT: Video + code lecture on building nanoGPT from scratch.
  24. LLM101n: Let's build a Storyteller.
  25. Knowledge Graphs for RAG
  26. LLMs From Scratch (Datawhale version)
  27. OpenRAG
  28. 通往AGI之路 (The Road to AGI)
  29. Andrej Karpathy - Neural Networks: Zero to Hero
  30. Interactive visualization of Transformer
  31. andysingal/llm-course
  32. LM-class

教程 Tutorial

  1. 动手学大模型应用开发 (Hands-on LLM Application Development)
  2. AI开发者频道 (AI Developer Channel)
  3. Bilibili: 五里墩茶社
  4. Bilibili: 木羽Cheney
  5. YouTube: AI Anytime
  6. Bilibili: 漆妮妮
  7. Prompt Engineering Guide
  8. YouTube: AI超元域
  9. Bilibili: TechBeat人工智能社区 (TechBeat AI community)
  10. Bilibili: 黄益贺
  11. Bilibili: 深度学习自然语言处理
  12. LLM Visualization
  13. Zhihu: 原石人类
  14. Bilibili: 小黑黑讲AI
  15. Bilibili: 面壁的车辆工程师
  16. Bilibili: AI老兵文哲

论文 Paper

  1. Hermes-3-Technical-Report
  2. The Llama 3 Herd of Models
  3. Qwen Technical Report
  4. Qwen2 Technical Report
  5. DeepSeek LLM: Scaling Open-Source Language Models with Longtermism
  6. DeepSeek-V2: A Strong, Economical, and Efficient Mixture-of-Experts Language Model
  7. Baichuan 2: Open Large-scale Language Models
  8. DataComp-LM: In search of the next generation of training sets for language models
  9. OLMo: Accelerating the Science of Language Models
  10. MAP-Neo: Highly Capable and Transparent Bilingual Large Language Model Series
  11. Chinese Tiny LLM: Pretraining a Chinese-Centric Large Language Model

Tips

  1. What We Learned from a Year of Building with LLMs (Part I)
  2. What We Learned from a Year of Building with LLMs (Part II)
  3. What We Learned from a Year of Building with LLMs (Part III): Strategy
  4. 轻松入门大语言模型(LLM)
  5. LLMs for Text Classification: A Guide to Supervised Learning
  6. Unsupervised Text Classification: Categorize Natural Language With LLMs
  7. Text Classification With LLMs: A Roundup of the Best Methods
  8. LLM Pricing
