Nagico's starred repositories
LLaMA-Factory
A WebUI for Efficient Fine-Tuning of 100+ LLMs (ACL 2024)
Awesome-Chinese-LLM
A curated collection of open-source Chinese large language models, focusing on smaller-scale, privately deployable models with low training costs, including base models, domain-specific fine-tunes and applications, datasets, and tutorials.
Llama-Chinese
Llama Chinese community. The Llama3 online demo and fine-tuned models are now open, with the latest Llama3 learning resources aggregated in real time; all code has been updated to support Llama3. Building the best Chinese Llama model, fully open source and commercially usable.
PowerInfer
High-speed Large Language Model Serving on PCs with Consumer-grade GPUs
lm-evaluation-harness
A framework for few-shot evaluation of language models.
Awesome-Incremental-Learning
Awesome Incremental Learning
flops-counter.pytorch
Flops counter for convolutional networks in pytorch framework
whisper-vits-svc
Core Engine of Singing Voice Conversion & Singing Voice Clone
Megatron-DeepSpeed
Ongoing research training transformer language models at scale, including: BERT & GPT-2
Awesome-LLM-Compression
Awesome LLM compression research papers and tools.
AssetStudio
AssetStudio - Based on the archived Perfare's AssetStudio, this project continues Perfare's work to keep AssetStudio up to date, with support for new Unity versions and additional improvements.
Best-Incremental-Learning
An Incremental Learning, Continual Learning, and Life-Long Learning Repository
calculate-flops.pytorch
calflops is designed to calculate FLOPs, MACs, and parameter counts for a wide range of neural networks, such as Linear, CNN, RNN, GCN, and Transformer models (BERT, LLaMA, and other large language models).
CuAssembler
An unofficial cuda assembler, for all generations of SASS, hopefully :)
how-to-learn-deep-learning-framework
how to learn PyTorch and OneFlow
ArknightsGameResource
Arknights client assets
ai_and_memory_wall
AI and Memory Wall
Efficient_Foundation_Model_Survey
Survey Paper List - Efficient LLM and Foundation Models
BM-Training
Dive into Big Model Training
cuda-calculator
Online CUDA Occupancy Calculator
ArknightsAssets
Arknights assets