desire2020 / NTG-Papers

Paper List for (Unconditional) Neural Text Generation

Paper Collection of Neural Text Generation (NTG)

Neural Text Generation refers to a family of methods that use neural networks as function approximators to model the underlying distribution of natural language. The most prominent applications of the conditional version of this task include Neural Machine Translation (NMT), neural image captioning, and dialogue systems (chatbots). NTG research, however, usually focuses on the unconditional problem: learning the latent distribution of the target language itself, rather than a mapping from a source form to a target form.
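To make the "unconditional" setting concrete, here is a minimal sketch (not from any paper in this list) of ancestral sampling from a factorized distribution p(x) = ∏ₜ p(xₜ | x₍₍ₜ₎₎): tokens are drawn one at a time with no source-side input to condition on. A hand-written bigram table stands in for the neural network; the table and all names are illustrative assumptions.

```python
import random

# Toy next-token distributions standing in for a learned neural model.
# In NTG, a network would produce p(x_t | x_<t); here it is hard-coded.
BIGRAMS = {
    "<s>": {"the": 0.6, "a": 0.4},
    "the": {"cat": 0.5, "dog": 0.5},
    "a":   {"cat": 0.5, "dog": 0.5},
    "cat": {"sat": 1.0},
    "dog": {"sat": 1.0},
    "sat": {"</s>": 1.0},
}

def sample_sentence(rng: random.Random) -> list:
    """Ancestral sampling: draw x_1, x_2, ... until the end token,
    with no conditioning input (unlike NMT or captioning)."""
    tokens, prev = [], "<s>"
    while prev != "</s>":
        dist = BIGRAMS[prev]
        nxt = rng.choices(list(dist), weights=list(dist.values()))[0]
        if nxt != "</s>":
            tokens.append(nxt)
        prev = nxt
    return tokens

print(" ".join(sample_sentence(random.Random(0))))
```

The conditional variants (NMT, captioning) differ only in that each step would also condition on an encoded source input; the unconditional papers collected here study modeling p(x) itself.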

This repository collects research papers on Neural Text Generation (NTG), organized into a taxonomy by publication time, method paradigm, and paper type.

Taxonomy of Papers

Survey and Theoretical Analysis

Metrics, Toolbox and Dataset

Online-Available Courses

Research Paper
