KunWangR's starred repositories
chinese_chatbot_corpus
Public Chinese chatbot corpus
bert4keras
Keras implementation of transformers for humans
embedding_study
Generating character embeddings from Chinese pre-trained models; testing the Chinese performance of BERT and ELMo
nlp_research
NLP research: TensorFlow-based NLP deep-learning projects supporting four tasks: text classification, sentence matching, sequence labeling, and text generation
simple_elmo
Simple library to work with pre-trained ELMo models in TensorFlow
bert_and_ernie
TensorFlow code and pre-trained models for BERT and ERNIE
bert-distillation
Language Model Distillation for Text Classification (CS 244r, Spring 2019)
to_distill_or_not
Model compression using BERT and knowledge distillation
roberta-wwm-base-distill
A distilled RoBERTa-wwm-base model, distilled from RoBERTa-wwm-large
Knowledge-distillation-using-BERT
Using Google's pretrained language model, BERT, I aim to train a smaller bidirectional LSTM model with knowledge distillation. The Kaggle Toxic Comment Classification dataset is used for this task.
nlp_chinese_corpus
Large-scale Chinese corpus for NLP
bert_language_understanding
Pre-training of Deep Bidirectional Transformers for Language Understanding: pre-train TextCNN
Capsule-Text-Classification
A capsule-network text classification model built with Keras, along with RNN, CNN, HAN, and other models; keras_utils contains the Keras implementations of the capsule and attention layers
simpletransformers
Transformers for Information Retrieval, Text Classification, NER, QA, Language Modelling, Language Generation, T5, Multi-Modal, and Conversational AI
distillation-BERT
Knowledge distillation on BERT
distill-bert
Knowledge Distillation from BERT
Chinese-Word-Vectors
100+ pre-trained Chinese word vectors