
Pretrained language models and related optimization techniques developed by Huawei Noah's Ark Lab.

Pretrained Language Model

This repository provides the latest pretrained language models and related optimization techniques developed by Huawei Noah's Ark Lab.

Directory structure

  • NEZHA is a pretrained Chinese language model that achieves state-of-the-art performance on several Chinese NLP tasks.
  • TinyBERT is a compressed BERT model that is 7.5x smaller and 9.4x faster at inference.
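TinyBERT's compression is based on knowledge distillation, where a small student model is trained to mimic a large teacher. As a rough illustration of the idea only (TinyBERT's actual objective also distills embeddings, hidden states, and attention maps, not just output logits), here is a minimal sketch of a logit-level distillation loss; all function names are hypothetical:

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of logits."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """Soft cross-entropy between the teacher's and student's
    temperature-softened distributions -- the core objective of
    logit-based knowledge distillation (illustrative sketch only)."""
    p = softmax(teacher_logits, temperature)  # teacher soft labels
    q = softmax(student_logits, temperature)  # student predictions
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q))

# A student that matches the teacher incurs a lower loss than one that
# disagrees with it.
teacher = [3.0, 1.0, 0.2]
good_student = [2.9, 1.1, 0.1]
bad_student = [0.1, 1.0, 3.0]
assert distillation_loss(teacher, good_student) < distillation_loss(teacher, bad_student)
```

Raising the temperature softens both distributions so the student also learns the teacher's relative rankings of incorrect classes, not just its top prediction.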


License: Apache License 2.0


Languages

  • Python 99.1%
  • Shell 0.7%
  • Dockerfile 0.1%