`translate_iterable` is not properly handling `max_input_length`
winstxnhdw opened this issue:
Even after setting `max_input_length = 0`, I get the following error when the input exceeds 1024 tokens:

```
RuntimeError: No position encodings are defined for positions >= 1024, but got position 1116
```
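A minimal sketch of the kind of call that triggers this (the model path, tokenizer path, file name, and language codes below are placeholders, not from the original report):

```python
from itertools import repeat

import ctranslate2
import sentencepiece as spm

# Placeholder paths: a CTranslate2-converted NLLB model and its
# SentencePiece tokenizer.
translator = ctranslate2.Translator("nllb-200-ct2")
sp = spm.SentencePieceProcessor("sentencepiece.bpe.model")

def sources():
    # NLLB expects the source language token in front and </s> at the end.
    for line in open("input.txt"):
        yield ["eng_Latn", *sp.encode(line.strip(), out_type=str), "</s>"]

# max_input_length=0 is documented to disable input truncation, yet
# inputs longer than 1024 tokens still raise the error above.
results = translator.translate_iterable(
    sources(),
    target_prefix=repeat(["fra_Latn"]),  # one target language token per line
    max_input_length=0,
)

for result in results:
    # Drop the leading target language token before detokenizing.
    print(sp.decode(result.hypotheses[0][1:]))
```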
What kind of model did you use? If the `encodings` variable exists in the model, the length of the position encodings is fixed, so you would get this error. Otherwise, we create the position encodings matrix based on the length of the input.
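To illustrate the distinction described here, a conceptual sketch (not CTranslate2's actual implementation): learned position embeddings are a fixed table stored in the model, while sinusoidal encodings can be computed for any input length.

```python
import numpy as np

def position_encodings(length, dim, fixed_table=None):
    """Return position encodings for `length` positions of size `dim`."""
    if fixed_table is not None:
        # Learned position embeddings ship as a fixed table in the model:
        # positions beyond its size simply do not exist.
        if length > fixed_table.shape[0]:
            raise RuntimeError(
                f"No position encodings are defined for positions >= "
                f"{fixed_table.shape[0]}, but got position {length}"
            )
        return fixed_table[:length]

    # Sinusoidal encodings are generated on the fly for any length.
    positions = np.arange(length)[:, None]
    dims = np.arange(dim)[None, :]
    angles = positions / np.power(10000.0, (2 * (dims // 2)) / dim)
    return np.where(dims % 2 == 0, np.sin(angles), np.cos(angles))
```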
I am using NLLB, and it seems you are right. The `tokenizer_config.json` sets `model_max_length` to 1024.
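For anyone hitting the same thing, the limit can be confirmed directly from the checkpoint's tokenizer config (the directory name below is a placeholder for a local Hugging Face NLLB checkpoint):

```python
import json

# Placeholder path to a local Hugging Face NLLB checkpoint directory.
with open("nllb-200-distilled-600M/tokenizer_config.json") as f:
    tokenizer_config = json.load(f)

# NLLB checkpoints set model_max_length to 1024, matching the model's
# fixed position embedding table.
print(tokenizer_config["model_max_length"])
```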