There are 26 repositories under the abstractive-text-summarization topic.
A curated list of resources dedicated to text summarization
Multiple implementations of abstractive text summarization, using Google Colab
Abstractive summarization using BERT as encoder and Transformer decoder
This repository contains the code, data, and models of the paper titled "XL-Sum: Large-Scale Multilingual Abstractive Summarization for 44 Languages" published in Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021.
Implementation of abstractive summarization using LSTM in the encoder-decoder architecture with local attention.
Abstractive Text Summarization using Transformer
[ACL2020] Unsupervised Opinion Summarization as Copycat-Review Generation
[AAAI2021] Unsupervised Opinion Summarization with Content Planning
[ACL-IJCNLP 2021] Self-Supervised Multimodal Opinion Summarization
[ACL2020] Unsupervised Opinion Summarization with Noising and Denoising
An optimized Transformer-based abstractive summarization model with TensorFlow
SumSimple is a FastAPI-based text summarization service using traditional, non-LLM algorithms like SumBasic, Luhn, Edmundson, LexRank, TextRank, and LSA.
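The classic algorithms listed share a frequency-based idea: score sentences by how often their words appear in the document, then keep the top scorers. A minimal pure-Python sketch of that idea (closest to SumBasic/Luhn; `frequency_summarize` is an illustrative helper, not the service's actual code):

```python
import re
from collections import Counter

def frequency_summarize(text, n_sentences=1):
    """Score each sentence by the average document frequency of its
    words (SumBasic/Luhn style) and return the top n sentences."""
    sentences = [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]
    freq = Counter(re.findall(r"[a-z']+", text.lower()))

    def score(sentence):
        tokens = re.findall(r"[a-z']+", sentence.lower())
        return sum(freq[t] for t in tokens) / len(tokens) if tokens else 0.0

    ranked = sorted(sentences, key=score, reverse=True)
    chosen = set(ranked[:n_sentences])
    # Emit the selected sentences in their original document order
    return [s for s in sentences if s in chosen]
```

Graph-based methods like LexRank and TextRank replace the raw-frequency score with a PageRank-style centrality over a sentence-similarity graph, but the select-whole-sentences step is the same — which is why these are extractive rather than abstractive summarizers.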
PyTorch implementation of Get To The Point: Summarization with Pointer-Generator Networks (2017) by Abigail See et al.
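The core of the pointer-generator model is mixing the decoder's vocabulary distribution with the attention (copy) distribution via a generation probability p_gen, which also lets it emit out-of-vocabulary source words. A toy numeric sketch of that mixing step (all names and values are illustrative, not the repo's code):

```python
def final_distribution(p_gen, vocab_dist, attention, source_tokens):
    """P(w) = p_gen * P_vocab(w) + (1 - p_gen) * (attention mass on
    source positions where w occurs), as in See et al. (2017)."""
    final = {w: p_gen * p for w, p in vocab_dist.items()}
    for attn, token in zip(attention, source_tokens):
        # Copying can place probability on OOV source tokens too
        final[token] = final.get(token, 0.0) + (1 - p_gen) * attn
    return final

vocab_dist = {"the": 0.5, "cat": 0.3, "sat": 0.2}  # toy P_vocab
attention = [0.7, 0.2, 0.1]                        # attention over source positions
source = ["cat", "mat", "the"]                     # "mat" is out-of-vocabulary
dist = final_distribution(0.8, vocab_dist, attention, source)
```

Because both input distributions sum to one and p_gen is a convex weight, the mixed distribution still sums to one; note "mat" receives probability mass even though it is outside the vocabulary.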
Non-anonymized CNN/DailyMail dataset for text summarization
Generates a summary of a given news article, using an attentional seq2seq encoder-decoder model.
Speaker diarization + speech-to-text + abstractive summarization
Abstractive Summarization in the Nepali language
Abstractive Text Summarization using Transformer model
Abstractive Text Summarization
TensorFlow 2 implementation of seq2seq with attention for context generation
Text summarization with human feedback
An AI-as-a-service for abstractive text summarization
Cornerstone seq2seq with attention (using a bidirectional LSTM)
Abstractive text summarization of Amazon reviews: an LSTM model generates a summary of the full review.
Prepares summaries of text reviews
Abstractive text summarization using the BART model on article data.
Abstractive text summarization generates a shorter version of a given sentence while attempting to preserve its contextual meaning. Our approach models the problem with an attentional encoder-decoder, which ensures that the decoder focuses on the appropriate input words at each step of generation.
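The attention step described above — the decoder weighting encoder states so it can focus on the appropriate input words — can be sketched as dot-product scoring followed by a softmax (a pure-Python toy with hypothetical names, not any particular repo's implementation):

```python
import math

def attend(decoder_state, encoder_states):
    """One attention step: score each encoder state against the current
    decoder state (dot product), softmax the scores into weights, and
    return the weights plus the weighted sum (context vector)."""
    scores = [sum(d * e for d, e in zip(decoder_state, h)) for h in encoder_states]
    m = max(scores)                      # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    weights = [x / total for x in exps]
    dim = len(encoder_states[0])
    context = [sum(w * h[i] for w, h in zip(weights, encoder_states))
               for i in range(dim)]
    return weights, context
```

At each decoding step the context vector is concatenated with the decoder state to predict the next output word, so input positions with high attention weight dominate the prediction.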
Using a deep learning model that combines LSTMs with a custom attention layer, we train on reviews and their existing summaries to generate brand-new summaries of its own.
Summarizing text to extract key ideas and arguments
Performing abstractive text summarization with T5 and serving it via a REST API
[Computer Speech & Language, Elsevier] - Neural Sentence Fusion for Diversity Driven Abstractive Multi-Document Summarization.
Transforming lengthy textual content into concise and meaningful summaries is the essence of this project. Leveraging the power of the Pegasus model, our abstractive text summarization repository aims to distill complex information into succinct and coherent summaries. Pegasus, a state-of-the-art pre-trained model, excels at generating human-like text.