Xiaonan Nie's repositories
Awesome-Mixture-of-Experts-Papers
A curated reading list of research on Mixture-of-Experts (MoE).
azurehpc
This repository provides easy automation scripts for building an HPC environment in Azure. It also includes examples for building an end-to-end environment and running some of the key HPC benchmarks and applications.
book
Deep Learning 101 with PaddlePaddle
caffe
Caffe: a fast open framework for deep learning.
codecaution.github.io
GitHub Pages template for academic personal websites, forked from mmistakes/minimal-mistakes
CS-Books
📚 Computer Science books in PDF format
DeepSpeed
DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.
fairseq
Facebook AI Research Sequence-to-Sequence Toolkit written in Python.
Hetu
A high-performance distributed deep learning system targeting large-scale and automated distributed training.
leaf
Leaf: A Benchmark for Federated Settings
machine-learning-notes
My continuously updated Machine Learning, Probabilistic Models, and Deep Learning notes and demos (2000+ slides), with video links
Megatron-LM
Ongoing research training transformer language models at scale, including: BERT & GPT-2
PKUAutoSubmit_online
A beginner-friendly, GitHub Actions-based automatic campus entry/exit reporting tool for Peking University students; no file downloads or environment setup required
PromptPapers
Must-read papers on prompt-based tuning of pre-trained language models.
tensorflow
An Open Source Machine Learning Framework for Everyone
tutel
Tutel MoE: An Optimized Mixture-of-Experts Implementation