Repositories under the nlg topic:
This repo is deprecated and no longer maintained; go to https://github.com/pytorch/tutorials instead.
Text generator is a handy plugin for Obsidian that helps you generate text content using GPT-3 (OpenAI).
A chatbot for the finance and judicial domains (with chit-chat capability). Its main modules include information extraction, NLU, NLG, and a knowledge graph; the front-end display is integrated via Django, and RESTful interfaces for the NLP and KG components are already packaged.
Neural question generation using transformers
🎯🗯 Dataset generation for AI chatbots, NLP tasks, named entity recognition or text classification models using a simple DSL!
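The entry above describes generating labeled NLP datasets from a simple DSL. A minimal sketch of the idea, assuming a hypothetical `{slot}` template syntax (this is an illustration of template expansion in general, not the repo's actual DSL):

```python
import itertools

def expand(template: str, slots: dict) -> list:
    """Expand every {slot} placeholder in the template against all slot values,
    producing one utterance per combination (a tiny dataset-generation DSL)."""
    names = [n for n in slots if "{" + n + "}" in template]
    results = []
    for combo in itertools.product(*(slots[n] for n in names)):
        text = template
        for name, value in zip(names, combo):
            text = text.replace("{" + name + "}", value)
        results.append(text)
    return results

samples = expand(
    "{greeting}, I want to {intent}",
    {"greeting": ["Hi", "Hello"], "intent": ["book a flight", "cancel my order"]},
)
# 2 greetings x 2 intents -> 4 utterances
```

Each expanded utterance can then be paired with the intent label it was generated from, which is how such DSLs typically feed chatbot or text-classification training.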
Accelerated Text is a no-code natural language generation platform. It will help you construct document plans which define how your data is converted to textual descriptions varying in wording and structure.
Author: Wenhao Yu (wyu1@nd.edu). ACM Computing Survey'22. Reading list for knowledge-enhanced text generation, with a survey.
A curated list of resources dedicated to Natural Language Generation (NLG)
Neural network-based chess engine capable of natural language commentary
Abstractive summarisation using BERT as the encoder and a Transformer decoder
A Chinese text generation (NLG) toolkit for text summarization, with corpus data. Extractive summarization methods: Lead-3, keyword, TextRank, TextTeaser, word significance, LDA, LSI, and NMF (graph-, feature-, and topic-model-based summarization toolkit).
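Two of the extractive baselines listed above (Lead-3 and word significance) are simple enough to sketch in plain Python. The sentence splitter and scoring below are illustrative simplifications, not the toolkit's actual implementation:

```python
import re
from collections import Counter

def split_sentences(text: str) -> list:
    return [s for s in re.split(r"(?<=[.!?])\s+", text.strip()) if s]

def lead3(text: str) -> list:
    """Lead-3 baseline: the summary is simply the first three sentences."""
    return split_sentences(text)[:3]

def word_significance(text: str, k: int = 2) -> list:
    """Rank sentences by summed document-level word frequency (Luhn-style),
    then emit the top-k sentences in their original order."""
    sents = split_sentences(text)
    freq = Counter(w for s in sents for w in re.findall(r"\w+", s.lower()))
    scored = sorted(sents, key=lambda s: -sum(freq[w] for w in re.findall(r"\w+", s.lower())))
    top = set(scored[:k])
    return [s for s in sents if s in top]
```

Despite its simplicity, Lead-3 is a surprisingly strong baseline for news-style text, which is why extractive toolkits ship it alongside graph methods like TextRank.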
An NLP system for generating reading comprehension questions
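Reading-comprehension question generation, as in the repo above, is usually neural, but the core idea can be sketched with a crude rule-based cloze heuristic (masking a likely answer span). Everything below is an illustrative assumption, not the repo's method:

```python
import re

def cloze_questions(passage: str) -> list:
    """Turn each sentence into a fill-in-the-blank question by masking a
    capitalized, non-sentence-initial token (a crude proper-noun heuristic)."""
    qa_pairs = []
    for sent in re.split(r"(?<=[.!?])\s+", passage.strip()):
        words = sent.split()
        candidates = [w.strip(".,") for w in words[1:] if w[:1].isupper()]
        if not candidates:
            continue  # no plausible answer span in this sentence
        answer = candidates[0]
        question = sent.replace(answer, "_____", 1)
        qa_pairs.append((question, answer))
    return qa_pairs
```

A neural system replaces the heuristic answer selection and the blanking step with learned answer extraction and a sequence-to-sequence question writer, but the passage → (question, answer) pipeline shape is the same.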
A Repo to store the Google Colaboratory Notebooks that I have created and shared
Papers and books to look at when starting AGI 📚
✒️ Cedille is a large French language model (6B), released under an open-source license
A PyTorch implementation of fine-tuning GPT-2 (Generative Pre-trained Transformer 2) for dialogue generation.
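Before fine-tuning GPT-2 on dialogue, multi-turn conversations are typically flattened into single training strings with speaker markers. A minimal sketch of that preprocessing step (the token names `<bos>`, `<eos>`, `<sp1>`, `<sp2>` are illustrative assumptions, not the repo's actual vocabulary):

```python
def format_dialogue(turns: list, bos: str = "<bos>", eos: str = "<eos>",
                    sp1: str = "<sp1>", sp2: str = "<sp2>") -> str:
    """Flatten a two-speaker dialogue into one training string, prefixing each
    turn with an alternating speaker token so the model learns turn-taking."""
    parts = [bos]
    for i, turn in enumerate(turns):
        parts.append(sp1 if i % 2 == 0 else sp2)
        parts.append(turn)
    parts.append(eos)
    return " ".join(parts)
```

The resulting strings are tokenized (with the speaker markers registered as special tokens) and fed to the language-modeling objective; at inference time, generation is prompted with the history up to the next speaker token.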
Chart-to-Text: Generating Natural Language Explanations for Charts by Adapting the Transformer Model
AAAI-20 paper: Cross-Lingual Natural Language Generation via Pre-Training
This repository has scripts and Jupyter notebooks to perform all the steps involved in the Transforming Delete, Retrieve, Generate approach for controlled text style transfer
A collection of exploratory notebooks, written in pure Python and in plain language that we humans can read. It compiles the lectures from Andrej Karpathy's 💎 playlist on neural networks, culminating in building GPT.
Code for A Hierarchical Model for Data-to-Text Generation (Rebuffel, Soulier, Scoutheeten, Gallinari; ECIR 2020)
MediaWiki extension to handle multilingual abstract content