nlpyang / BertSum

Code for paper Fine-tune BERT for Extractive Summarization


How to train the TransformerExt baseline?

GongShuai8210 opened this issue · comments

I just changed the encoder to 'baseline', but the ROUGE scores I get are higher than those reported in the paper. I suspect the model is still using BERT, but I am not sure how to train the plain Transformer baseline correctly.
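For reference, the non-pretrained TransformerExt baseline in this repo is selected with `-encoder baseline` (the setting the question mentions). A minimal sketch of a training invocation, assuming the flag names from this repo's `train.py`; the paths and hyperparameter values below are illustrative placeholders, not the paper's exact settings:

```shell
# Sketch: train the TransformerExt (no-BERT) baseline.
# Paths and hyperparameter values are illustrative; check the repo
# README and train.py for the exact options and recommended values.
python train.py \
  -mode train \
  -encoder baseline \
  -bert_data_path ../bert_data/cnndm \
  -model_path ../models/transformer_baseline \
  -lr 2e-3 \
  -visible_gpus 0 \
  -report_every 50 \
  -save_checkpoint_steps 1000 \
  -train_steps 50000
```

If the scores still look like BERT-level results, it is worth confirming in the training log that no pretrained BERT checkpoint is being loaded for the encoder.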