Repositories under the pointer-generator topic:
Multiple implementations of abstractive text summarization, using Google Colab
An Abstractive Summarization Implementation with Transformer and Pointer-Generator
Datasets I have created for scientific summarization, and a trained BertSum model
My seq2seq implementation, based on TensorFlow
A PyTorch implementation of "Get To The Point: Summarization with Pointer-Generator Networks"
PyTorch implementation of the ACL paper "Get To The Point: Summarization with Pointer-Generator Networks" (See et al., 2017), adapted to a Korean dataset
Pointer-Generator Network: Seq2Seq with attention, pointing, and coverage mechanisms for abstractive summarization
PyTorch implementation of "Get To The Point: Summarization with Pointer-Generator Networks" (2017) by Abigail See et al.
An Abstractive Summarization implementation (for English-language datasets) with Transformer and Pointer-Generator
Text Summarizer implemented in PyTorch
PyTorch implementation of "Get To The Point: Summarization with Pointer-Generator Networks"
News headline generation
Resources for the paper "Get To The Point: Summarization with Pointer-Generator Networks", ported to Python 3.x. Overview in the blog post at http://www.abigailsee.com/2017/04/16/taming-rnns-for-better-summarization.html
An Implementation of Copy Seq2Seq
Code for Master's Thesis on 'Neural Automatic Summarization' written at the IT University of Copenhagen
The pointer-generator network does a better job of copying words from the source text. It can also copy out-of-vocabulary words, allowing the model to handle unseen words even when the corpus has a small vocabulary.
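The copy mechanism described above mixes the generator's vocabulary distribution with the attention distribution over source tokens, so probability mass can land on source words outside the fixed vocabulary. A minimal NumPy sketch of that mixing step (function and variable names here are illustrative, not taken from any of the listed repositories):

```python
import numpy as np

def final_distribution(p_gen, vocab_dist, attn_dist, src_ids, extended_vocab_size):
    """Sketch of the pointer-generator output mixture (See et al., 2017).

    p_gen: scalar in [0, 1], probability of generating from the vocabulary.
    vocab_dist: (V,) softmax over the fixed vocabulary.
    attn_dist: (T,) attention weights over the T source tokens.
    src_ids: (T,) ids of source tokens in the extended vocabulary,
             where OOV source words get ids >= V.
    """
    final = np.zeros(extended_vocab_size)
    final[: len(vocab_dist)] = p_gen * vocab_dist
    # Scatter-add copy probabilities onto source token ids; this is what
    # lets the model emit out-of-vocabulary source words.
    np.add.at(final, src_ids, (1.0 - p_gen) * attn_dist)
    return final
```

With `p_gen = 0.8`, a uniform vocabulary distribution over 4 in-vocabulary words, and attention split evenly over two source tokens (one of them OOV with extended id 4), the OOV token receives probability 0.2 × 0.5 = 0.1, and the mixture still sums to 1.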
Cornerstone seq2seq with attention (using a bidirectional LSTM)
UCL Statistical Natural Language Processing group project. Text summarization with Seq2Seq, pointer-generator, SeqGAN, and PointerGAN.
An improved implementation of Beam Search Decoding in RNN-based Seq2Seq Architecture
Comparing the performance of LSTM and GRU for Text Summarization using Pointer-Generator Networks
TensorFlow 2.0 implementation of the Pointer-Generator network from the "Get To The Point" paper (https://arxiv.org/abs/1704.04368)
Text Summarization using Residual Logarithmic LSTMs
Recalculating ROUGE scores for See et al. (2017) test outputs.
Pointer-Generator Networks with Different Word Embeddings for Abstractive Summarization
Get to the point - Pointer Generator Network