ZJLab-DataHub&Security's repositories
LLaMA-Factory
Unify Efficient Fine-Tuning of 100+ LLMs
Megatron-DeepSpeed
Ongoing research training transformer language models at scale, including: BERT & GPT-2
License: NOASSERTION
Language: Python
DeepSpeed
DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.
License: Apache-2.0