Violet5412's starred repositories
torchkeras
PyTorch ❤️ Keras 😋😋
Awesome-LLM-for-RecSys
Survey: a collection of awesome papers and resources on large language model (LLM)-related recommender system topics.
LLM4Rec-Awesome-Papers
A list of awesome papers and resources on recommender systems based on large language models (LLMs).
LLM-FineTuning-Large-Language-Models
LLM (Large Language Model) Fine-Tuning
LLaMA-Factory
A WebUI for Efficient Fine-Tuning of 100+ LLMs (ACL 2024)
tensor2tensor
Library of deep learning models and datasets designed to make deep learning more accessible and accelerate ML research.
Awesome-Chinese-LLM
A curated list of open-source Chinese large language models, focusing on smaller models that can be privately deployed and trained at low cost, covering base models, vertical-domain fine-tuning and applications, datasets, and tutorials.
llm-cookbook
An introductory LLM tutorial for developers: a Chinese edition of Andrew Ng's large-model course series.
mistral-inference
Official inference library for Mistral models
MixtralKit
A toolkit for inference and evaluation of 'mixtral-8x7b-32kseqlen' from Mistral AI
vision-transformers-cifar10
Let's train vision transformers (ViT) on CIFAR-10!
paper-reading
Paragraph-by-paragraph close readings of classic and recent deep learning papers.
transformers
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
llm-action
This project shares the technical principles behind large language models along with hands-on experience.
ChatGLM-6B
ChatGLM-6B: An Open Bilingual Dialogue Language Model
soft-mixture-of-experts
PyTorch implementation of Soft MoE by Google Brain in "From Sparse to Soft Mixtures of Experts" (https://arxiv.org/pdf/2308.00951.pdf)
x-transformers
A simple but complete full-attention transformer with a set of promising experimental features from various papers
recsys-mixture-of-experts
Recommender systems with mixture-of-experts.
Mixture_of_Experts
Implementation of Mixture of Experts paper
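The mixture-of-experts idea behind the entries above can be sketched in a few lines of dependency-free Python. This is a hypothetical toy illustration (not code from any listed repo): a softmax gate weights the outputs of several scalar "experts", here plain linear maps.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of logits.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def moe_forward(x, experts, gate_weights):
    """Toy mixture-of-experts forward pass.

    x            -- input vector (list of floats)
    experts      -- one weight vector per expert; each expert computes a
                    scalar output as a dot product with x
    gate_weights -- one gate vector per expert; gate logits are dot
                    products with x, turned into mixing weights by softmax
    """
    logits = [sum(xi * gi for xi, gi in zip(x, g)) for g in gate_weights]
    weights = softmax(logits)
    outputs = [sum(xi * wi for xi, wi in zip(x, w)) for w in experts]
    # Output is the gate-weighted combination of all expert outputs.
    return sum(w * o for w, o in zip(weights, outputs))

x = [1.0, 2.0]
experts = [[0.5, -0.2], [0.1, 0.3]]   # expert 0: 0.1, expert 1: 0.7 on this x
gates = [[1.0, 0.0], [0.0, 1.0]]      # gate logits: [1.0, 2.0]
y = moe_forward(x, experts, gates)
```

Sparse variants (as in Mixtral or the adaptive-computation list above) keep only the top-k gate weights and run just those experts; the soft variant mixes all of them, as here.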
awesome-adaptive-computation
A curated reading list of research in Adaptive Computation, Dynamic Compute & Mixture of Experts (MoE).
federated-learning-mixture
Federated learning using a mixture of experts
MultimodalRecSys
A curated list of awesome resources about multimodal recommender systems.
coding-interview-university
A complete computer science study plan to become a software engineer.
annotated-transformer
An annotated implementation of the Transformer paper.