stdjhs / nlp_machine_learning_papers

Chinese translations of NLP and machine learning papers

If you find a poorly translated passage while reading, feel free to fix it directly.

If there are other papers you would like to read and translate, convert the paper to HTML and upload it; if you run into problems, leave a comment in Issues.

See the Wiki for how to convert LaTeX to HTML.
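The Wiki is the authoritative reference for this repo's conversion workflow. As a rough, hedged sketch of one common alternative (assuming pandoc is installed; `paper.tex` and `paper.html` are placeholder file names, and the Wiki may document a different tool or options):

```python
# Minimal sketch: convert a LaTeX source file to standalone HTML with pandoc.
# Assumes pandoc is on PATH; file names are placeholders, not from this repo.
import subprocess

subprocess.run(
    [
        "pandoc", "paper.tex",
        "--standalone",   # emit a complete HTML document
        "--mathjax",      # render equations with MathJax
        "-o", "paper.html",
    ],
    check=True,
)
```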

Big Self-Supervised Models are Strong Semi-Supervised Learners, 2020, unsupervised pre-training + supervised fine-tuning for images, SimCLR v2

ELECTRA: Pre-training Text Encoders as Discriminators Rather Than Generators, 2020, ELECTRA, a generator + discriminator pre-trained language model

DialoGLUE: A Natural Language Understanding Benchmark for Task-Oriented Dialogue, 2020, DialoGLUE, a natural language understanding benchmark for task-oriented dialogue

DIET: Lightweight Language Understanding for Dialogue Systems, 2020, Rasa DIET, lightweight language understanding for dialogue systems

ConveRT: Efficient and Accurate Conversational Representations from Transformers, 2020, ConveRT, efficient and accurate Transformer-based conversational representations

ConvLab-2: An Open-Source Toolkit for Building, Evaluating, and Diagnosing Dialogue Systems, 2020, the ConvLab-2 dialogue system toolkit

Poly-encoders: architectures and pre-training strategies for fast and accurate multi-sentence scoring, 2020, poly-encoder for multi-sentence scoring

FixMatch: Simplifying Semi-Supervised Learning with Consistency and Confidence, 2020, semi-supervised learning, FixMatch

XLNet: Generalized Autoregressive Pretraining for Language Understanding, 2019, XLNet

Unsupervised Data Augmentation for Consistency Training, 2019, augmentation of unlabeled data, UDA

EDA: Easy Data Augmentation Techniques for Boosting Performance on Text Classification Tasks, 2019, augmentation of labeled data, EDA

Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks, 2019, Sentence-BERT

RoBERTa: A Robustly Optimized BERT Pretraining Approach, 2019, BERT with an optimized pre-training procedure, RoBERTa

ALBERT: A Lite BERT for Self-Supervised Learning of Language Representations, 2019, memory-optimized BERT, ALBERT

An Evaluation Dataset for Intent Classification and Out-of-Scope Prediction, 2019, intent classification dataset

BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding, 2018, BERT

wav2letter++: The Fastest Open-Source Speech Recognition System, 2018, wav2letter++, the fastest open-source speech recognition system

Attention Is All You Need, 2017, Transformer

Layer Normalization, 2016, layer normalization
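For reference, layer normalization normalizes each sample over its feature dimension (rather than over the batch) and then applies a learned gain and bias. A minimal NumPy sketch; the function name and the `gamma`/`beta`/`eps` parameters are illustrative, not from a specific library:

```python
import numpy as np

def layer_norm(x, gamma, beta, eps=1e-5):
    """Normalize each row of x over its last (feature) axis, then rescale and shift."""
    mean = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    x_hat = (x - mean) / np.sqrt(var + eps)
    return gamma * x_hat + beta

x = np.random.randn(2, 8)                      # (batch, features)
y = layer_norm(x, np.ones(8), np.zeros(8))     # identity gain/zero bias for the demo
```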

Gaussian Error Linear Units (GELUs), 2016, GELU non-linearity
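GELU weights the input by the standard Gaussian CDF, GELU(x) = x * Phi(x). A small sketch of the exact form and the tanh approximation given in the paper (function names are illustrative):

```python
import numpy as np
from scipy.special import erf

def gelu(x):
    """Exact GELU: x * Phi(x), with Phi the standard normal CDF."""
    return 0.5 * x * (1.0 + erf(x / np.sqrt(2.0)))

def gelu_tanh(x):
    """Tanh approximation of GELU from the paper."""
    return 0.5 * x * (1.0 + np.tanh(np.sqrt(2.0 / np.pi) * (x + 0.044715 * x ** 3)))
```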

Neural Machine Translation of Rare Words with Subword Units, 2016, subword units, BPE
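The BPE learning step repeatedly merges the most frequent adjacent symbol pair in a word-frequency vocabulary. A minimal sketch close in spirit to the pseudocode in the paper, using a toy vocabulary (word keys are space-separated symbols, `</w>` marks word ends; the merge count of 10 is arbitrary):

```python
import re
from collections import Counter

def get_pair_stats(vocab):
    """Count adjacent symbol pairs over a {"sym sym ...": freq} vocabulary."""
    pairs = Counter()
    for word, freq in vocab.items():
        symbols = word.split()
        for a, b in zip(symbols, symbols[1:]):
            pairs[(a, b)] += freq
    return pairs

def merge_pair(pair, vocab):
    """Replace every whitespace-delimited occurrence of the pair with its concatenation."""
    pattern = re.compile(r"(?<!\S)" + re.escape(" ".join(pair)) + r"(?!\S)")
    return {pattern.sub("".join(pair), word): freq for word, freq in vocab.items()}

vocab = {"l o w </w>": 5, "l o w e r </w>": 2, "n e w e s t </w>": 6, "w i d e s t </w>": 3}
for _ in range(10):                       # number of merge operations to learn
    pairs = get_pair_stats(vocab)
    if not pairs:
        break
    vocab = merge_pair(pairs.most_common(1)[0][0], vocab)
```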

Pointer Networks, 2015, pointer networks

Effective Approaches to Attention-based Neural Machine Translation, 2015

Distilling the Knowledge in a Neural Network, 2015, knowledge distillation
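The core idea is to train the student on the teacher's temperature-softened output distribution in addition to the hard labels. A minimal NumPy sketch of the combined loss; `T`, `alpha`, and the logits below are illustrative placeholders, not values from the paper:

```python
import numpy as np

def softmax(z, T=1.0):
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Hard-label cross-entropy mixed with cross-entropy against soft teacher targets."""
    soft_teacher = softmax(teacher_logits, T)
    log_soft_student = np.log(softmax(student_logits, T) + 1e-12)
    soft_loss = -(soft_teacher * log_soft_student).sum(axis=-1).mean() * (T ** 2)
    hard_loss = -np.log(
        softmax(student_logits)[np.arange(len(labels)), labels] + 1e-12
    ).mean()
    return alpha * hard_loss + (1.0 - alpha) * soft_loss

s = np.array([[2.0, 0.5, 0.1]])   # student logits (toy)
t = np.array([[3.0, 0.2, -1.0]])  # teacher logits (toy)
print(distillation_loss(s, t, np.array([0])))
```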

Neural Machine Translation by Jointly Learning to Align and Translate, 2014, Seq2seq with attention

Convolutional Neural Networks for Sentence Classification, 2014, text classification, TextCNN

BLEU: a Method for Automatic Evaluation of Machine Translation, 2002, BLEU, a machine translation evaluation metric
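BLEU combines clipped n-gram precisions with a brevity penalty: BLEU = BP * exp(sum_n w_n * log p_n), with BP = 1 if the candidate is longer than the reference and exp(1 - r/c) otherwise. A minimal single-reference, sentence-level sketch (standard toolkits work at corpus level and add smoothing; the example sentences are made up):

```python
import math
from collections import Counter

def ngrams(tokens, n):
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def bleu(candidate, reference, max_n=4):
    """Sentence-level BLEU against a single reference, uniform n-gram weights."""
    log_prec_sum = 0.0
    for n in range(1, max_n + 1):
        cand, ref = ngrams(candidate, n), ngrams(reference, n)
        clipped = sum(min(c, ref[g]) for g, c in cand.items())  # clipped n-gram matches
        total = max(sum(cand.values()), 1)
        if clipped == 0:
            return 0.0                      # no smoothing in this sketch
        log_prec_sum += math.log(clipped / total) / max_n
    c, r = len(candidate), len(reference)
    bp = 1.0 if c > r else math.exp(1 - r / max(c, 1))           # brevity penalty
    return bp * math.exp(log_prec_sum)

print(bleu("the cat sat on the rug".split(), "the cat sat on the mat".split()))
```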
