AwesomeSEG
This repo contains our research summary for Story Ending Generation (SEG), together with the code and generated results of our work on SEG.
Contents:
1. SEG Task
Story ending generation is the task of generating an ending sentence of a story given a story context. For example, given the story context:
Today is Halloween.
Jack is so excited to go trick or treating tonight.
He is going to dress up like a monster.
The costume is real scary.
An SEG model is expected to generate a reasonable ending for the above story, such as:
He hopes to get a lot of candy.
1.1 Dataset - ROCStories Corpus
Existing SEG works all utilize the ROCStories Corpus to evaluate the performance of SEG models. The corpus contains 98,162 five-sentence stories; in each story, the first four sentences are used as the story context, while the last one is regarded as the story ending.
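The context/ending split can be illustrated with a short loader sketch. The column names below (`storyid`, `storytitle`, `sentence1` ... `sentence5`) follow the public ROCStories CSV release, but treat the exact layout as an assumption and adjust it to the file you download:

```python
import csv
import io

def load_rocstories(csv_text):
    """Split each five-sentence ROCStories row into (context, ending).

    Assumes the column layout of the public ROCStories CSV release:
    storyid, storytitle, sentence1 ... sentence5.
    """
    pairs = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        # Sentences 1-4 form the story context; sentence 5 is the gold ending.
        context = " ".join(row[f"sentence{i}"] for i in range(1, 5))
        ending = row["sentence5"]
        pairs.append((context, ending))
    return pairs

# Toy example using the Halloween story from above.
sample = (
    "storyid,storytitle,sentence1,sentence2,sentence3,sentence4,sentence5\n"
    "1,Halloween,Today is Halloween.,"
    "Jack is so excited to go trick or treating tonight.,"
    "He is going to dress up like a monster.,"
    "The costume is real scary.,"
    "He hopes to get a lot of candy.\n"
)
context, ending = load_rocstories(sample)[0]
```

A model is then trained to map `context` to `ending`.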
1.2 Existing Work
The concepts used in Tags are defined as follows:
- arch: the architecture of the model, including the `arch-LSTM`, `arch-GRU`, `arch-Transformer`, and `arch-GCN` tags.
- train: the training strategy of the model, including the `train-MLE`, `train-GAN`, and `train-ITF` tags.
- info: the additional information used in SEG, including the `info-Keywords`, `info-Sentiment`, `info-knowledge`, `info-DP` (Dependency Parsing), and `info-Controllable` tags.
- task: the `task-Metric` tag indicates an evaluation work.
2. SHGN
We provide the code and generated results of the DASFAA 2022 paper Incorporating Commonsense Knowledge into Story Ending Generation via Heterogeneous Graph Networks.
Code
Please refer to the SHGN directory.
Generated results
The generated results of our SHGN are available at SHGN.txt. To reproduce the evaluation scores reported in our paper, please use the nlg-eval and py-rouge toolkits to calculate the BLEU and ROUGE scores, respectively.