NVIDIA / FasterTransformer

Transformer related optimization, including BERT, GPT

Support for "no_repeat_ngram_size" parameter for generation

shreysingla11 opened this issue · comments

FasterTransformer currently has no way to block repeated n-grams during generation with the BART model, equivalent to the "no_repeat_ngram_size" parameter in "transformers.GenerationConfig". Without it, decoding can emit the same tokens over and over.
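
For reference, the behavior being requested can be sketched in a few lines of Python. This mirrors the n-gram banning done by the NoRepeatNGramLogitsProcessor in Hugging Face transformers; the function names below are illustrative only, not part of the FasterTransformer API:

```python
import torch

def banned_ngram_tokens(prev_ids: torch.Tensor, ngram_size: int) -> list[list[int]]:
    """For each sequence in the batch, find token ids that would complete
    an n-gram already present in the generated prefix (illustrative helper)."""
    batch_size, cur_len = prev_ids.shape
    if cur_len < ngram_size:
        # No n-gram can repeat yet.
        return [[] for _ in range(batch_size)]
    banned = []
    for seq in prev_ids.tolist():
        # Map each (n-1)-token prefix to the set of tokens that followed it.
        seen: dict[tuple, set] = {}
        for i in range(cur_len - ngram_size + 1):
            prefix = tuple(seq[i : i + ngram_size - 1])
            seen.setdefault(prefix, set()).add(seq[i + ngram_size - 1])
        # A token is banned if emitting it would repeat an already-seen n-gram.
        current_prefix = tuple(seq[cur_len - ngram_size + 1 :])
        banned.append(sorted(seen.get(current_prefix, set())))
    return banned

def apply_no_repeat_ngram(logits: torch.Tensor, prev_ids: torch.Tensor,
                          ngram_size: int) -> torch.Tensor:
    """Mask banned tokens out of the next-token logits (illustrative helper)."""
    for batch_idx, tokens in enumerate(banned_ngram_tokens(prev_ids, ngram_size)):
        logits[batch_idx, tokens] = float("-inf")
    return logits
```

In Hugging Face transformers this is enabled by simply passing no_repeat_ngram_size (e.g. 3) to generate(); the request here is for the same check inside FasterTransformer's BART decoding.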

Same problem.

same here