Generating Text Summary With GPT2

Accompanying code for the blog post Generating Text Summaries Using GPT-2 on PyTorch with Minimal Training.

Dataset Preparation

Run max_article_sizes.py separately for the CNN and the Daily Mail tokenized articles. It creates a pickle file recording the size of each CNN/Daily Mail article.
$ python max_article_sizes.py path/to/cnn_or_dailymail/tokenized/articles
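The script itself is not reproduced here, but a rough sketch of what this step does might look like the following. The output filename and the assumption that articles are stored one per file as whitespace-separated tokens are illustrative guesses, not the repo's exact behavior.

# Sketch only: walk the tokenized-article directory, record each article's
# token count, and pickle the resulting mapping for later use.
import os
import pickle
import sys

tokenized_dir = sys.argv[1]          # e.g. path/to/cnn_or_dailymail/tokenized/articles
sizes = {}
for name in os.listdir(tokenized_dir):
    with open(os.path.join(tokenized_dir, name), encoding="utf-8") as f:
        sizes[name] = len(f.read().split())   # number of tokens in this article

with open("article_sizes.pkl", "wb") as f:    # hypothetical output filename
    pickle.dump(sizes, f)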
Run the command below to prepare JSON files containing the tokenized articles and summaries:
$ python prepare_data.py path/to/pickle_file/of/articles/sizes/created/using/above/command
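The exact schema of these JSON files depends on prepare_data.py; as a rough illustration, each file should hold the GPT-2 token ids of one article and its summary. The file path and the field names "article" and "abstract" below are assumptions for the sake of the example.

# Illustrative only: load one prepared example and check its token counts.
import json

with open("path/to/json/files/0.json") as f:     # hypothetical file name
    record = json.load(f)

print(len(record["article"]), "article tokens")    # assumed field name
print(len(record["abstract"]), "summary tokens")   # assumed field name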

Training

Use the pretrained weights to fine-tune the GPT-2 model on your data, applying the tricks described in Generating Text Summaries Using GPT-2 on PyTorch with Minimal Training.

$ python train_gpt2_summarizer.py --batch_size 1 --root_dir path/to/json/files/created/using/prepare_data.py
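The core trick from the blog post is to feed GPT-2 the article and its summary concatenated with a separator token, and to compute the language-modeling loss only on the summary tokens. The sketch below illustrates this with the Hugging Face transformers API; the separator token, the example strings, and the single backward pass are illustrative and not the repo's exact training code.

# Minimal sketch of the fine-tuning objective: GPT-2 sees "article <sep> summary"
# and the loss is masked so that only summary positions contribute.
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2-medium")
tokenizer.add_special_tokens({"sep_token": "<|sep|>"})   # hypothetical separator token
model = GPT2LMHeadModel.from_pretrained("gpt2-medium")
model.resize_token_embeddings(len(tokenizer))            # account for the new token

article_ids = tokenizer.encode("A long news article about ...")  # stand-in for prepared token ids
summary_ids = tokenizer.encode("A short summary.")

input_ids = torch.tensor([article_ids + [tokenizer.sep_token_id] + summary_ids])

# -100 is ignored by the cross-entropy loss inside Hugging Face models,
# so article and separator positions do not contribute to the loss.
labels = input_ids.clone()
labels[:, : len(article_ids) + 1] = -100

loss = model(input_ids, labels=labels).loss
loss.backward()   # an optimizer.step() would follow in the real training loop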

Credit

Sample Efficient Text Summarization Using a Single Pre-Trained Transformer

Urvashi Khandelwal, Kevin Clark, Dan Jurafsky, Lukasz Kaiser

The training code in this repo has been adapted from Hugging Face's run_lm_finetuning.py.

About

A simple approach to using GPT2-medium (345M) to generate high-quality text summaries with minimal training.

