Support for "no_repeat_ngram_size" parameter for generation
shreysingla11 opened this issue · comments
There is currently no support for blocking repeated n-grams during generation with the Bart model, as done by the "no_repeat_ngram_size" parameter in the "transformers.GenerationConfig" class. As a result, the model can generate the same sequence of tokens over and over.
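For reference, the behavior that `no_repeat_ngram_size` enforces in `transformers` can be sketched in plain Python: after each step, any token that would complete an n-gram already present in the generated sequence is banned (its logit is set to `-inf`). The helper below is a minimal illustration of that rule, not the actual `transformers` implementation; the function name `banned_ngram_tokens` is hypothetical.

```python
def banned_ngram_tokens(prev_tokens, ngram_size):
    """Return the set of token ids that would complete an n-gram
    already present in prev_tokens (the tokens generated so far).

    A generation loop would set the logits of these tokens to -inf
    before sampling, which is what no_repeat_ngram_size does.
    """
    if len(prev_tokens) + 1 < ngram_size:
        return set()
    # Map each (n-1)-token prefix to the set of tokens that followed it.
    seen = {}
    for i in range(len(prev_tokens) - ngram_size + 1):
        prefix = tuple(prev_tokens[i:i + ngram_size - 1])
        seen.setdefault(prefix, set()).add(prev_tokens[i + ngram_size - 1])
    # Tokens following the current (n-1)-token suffix would repeat an n-gram.
    current_prefix = tuple(prev_tokens[len(prev_tokens) - ngram_size + 1:])
    return seen.get(current_prefix, set())
```

For example, with `prev_tokens = [1, 2, 3, 1, 2]` and `ngram_size = 3`, the trigram `(1, 2, 3)` has already occurred and the sequence currently ends in `(1, 2)`, so token `3` is banned. In `transformers` itself this is exposed simply as `model.generate(..., no_repeat_ngram_size=3)`.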
Same problem.
Same here.