pjlab-sys4nlp / llama-moe

⛷️ LLaMA-MoE: Building Mixture-of-Experts from LLaMA with Continual Pre-training

Home Page: https://arxiv.org/abs/2406.16554
