Cheng hou's repositories

DeepSpeed

DeepSpeed is a deep learning optimization library that makes distributed training easy, efficient, and effective.

Language: Python · License: MIT · Stargazers: 1 · Issues: 0
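
For context, a hedged sketch of the usual DeepSpeed usage pattern; the toy model and config values are assumptions for illustration, not taken from this repository, and the script is meant to be started with the `deepspeed` launcher.

```python
import torch
import deepspeed

model = torch.nn.Linear(10, 2)  # placeholder model, assumption for illustration

ds_config = {
    "train_batch_size": 8,
    "optimizer": {"type": "Adam", "params": {"lr": 1e-4}},
    "zero_optimization": {"stage": 1},
}

# deepspeed.initialize returns an engine that handles data parallelism,
# ZeRO optimizer-state partitioning, and backward/step bookkeeping.
model_engine, optimizer, _, _ = deepspeed.initialize(
    model=model,
    model_parameters=model.parameters(),
    config=ds_config,
)

# Inside the training loop, the engine replaces loss.backward() / optimizer.step():
#   loss = loss_fn(model_engine(inputs), targets)
#   model_engine.backward(loss)
#   model_engine.step()
```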

GPT2-Chinese

Chinese version of GPT2 training code, using BERT tokenizer.

Language: Python · License: MIT · Stargazers: 1 · Issues: 0
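
To illustrate the "BERT tokenizer" point, a small sketch using Hugging Face's `BertTokenizer`; this is not the repository's own tokenizer code, and the `bert-base-chinese` checkpoint name is an assumption chosen for the example.

```python
from transformers import BertTokenizer

# BERT-style tokenization of Chinese text, the same idea GPT2-Chinese relies on.
tokenizer = BertTokenizer.from_pretrained("bert-base-chinese")
ids = tokenizer.encode("今天天气真好", add_special_tokens=True)
print(tokenizer.convert_ids_to_tokens(ids))
```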

nlp-tutorial

Natural Language Processing Tutorial for Deep Learning Researchers

Language: Jupyter Notebook · License: MIT · Stargazers: 1 · Issues: 0

TencentPretrain

Tencent Pre-training framework in PyTorch & Pre-trained Model Zoo

Language: Python · License: NOASSERTION · Stargazers: 1 · Issues: 0

UER-py

Open Source Pre-training Model Framework in PyTorch & Pre-trained Model Zoo

Language: Python · License: Apache-2.0 · Stargazers: 1 · Issues: 0

transformers

🤗 Transformers: State-of-the-art Natural Language Processing for PyTorch and TensorFlow 2.0.

Language: Python · License: Apache-2.0 · Stargazers: 0 · Issues: 0
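
A minimal example of the library's pipeline API; the `"sentiment-analysis"` task string is part of the real API, while the input sentence is illustrative only.

```python
from transformers import pipeline

# Loads a default sentiment-analysis model and runs inference on one sentence.
classifier = pipeline("sentiment-analysis")
print(classifier("DeepSpeed makes large-model training much easier."))
# -> [{'label': 'POSITIVE', 'score': ...}]
```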