nlpyang / BertSum

Code for paper Fine-tune BERT for Extractive Summarization

default batch_size is 3000, I don't quite understand, why so huge?

yongzhuo opened this issue · comments

The batch_size here is not batch size in the usual sense (number of examples); it refers to the number of tokens in a batch. See the "batch" method in data_loader.py for details.
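To illustrate the idea, here is a minimal sketch of token-budget batching in the same spirit as the repo's data loader. This is not the actual implementation from data_loader.py; the function name and the assumption that cost is measured as padded size (longest example times batch length) are illustrative.

```python
def batch_by_tokens(examples, batch_size_tokens=3000):
    """Group examples so each batch stays within a token budget.

    `examples` is an iterable of token-id lists. `batch_size_tokens`
    plays the role of BertSum's `batch_size` flag: a token budget,
    not an example count (hypothetical simplification of the repo's
    "batch" method).
    """
    batch, max_len = [], 0
    for ex in examples:
        batch.append(ex)
        max_len = max(max_len, len(ex))
        # Padded size of the batch if this example were included.
        if max_len * len(batch) > batch_size_tokens:
            # Emit everything except the example that broke the budget.
            yield batch[:-1]
            batch, max_len = [ex], len(ex)
    if batch:
        yield batch

# Documents of 500..900 tokens fit into two batches under a 3000-token budget,
# so the number of examples per batch varies with document length.
docs = [[0] * n for n in (500, 600, 700, 800, 900)]
batches = list(batch_by_tokens(docs, batch_size_tokens=3000))
```

Under this scheme a batch_size of 3000 is modest: with ~750-token documents it amounts to roughly 3-4 documents per batch.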