I get this error when I use the example code from Hugging Face. How can I fix it?
sammichenVV opened this issue · comments
Hello,
Can you give us more details on the task you are working on? Is it a simple classification task? In that case, flattening your tensors will give you the correct result.
It's fixed by flattening the tensors. My task is translation; I fine-tune xlm-roberta, loading the model with `XLMRobertaForCausalLM.from_pretrained('xlm-roberta-base')`. I fixed it by flattening the tensors: `predictions = predictions.view(5120)` and `references = batch["labels"].view(5120)`.
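A minimal sketch of the flattening step described above. The shapes here are an assumption (batch_size=10, seq_len=512, so 10 × 512 = 5120 elements, matching the `view(5120)` call); the point is that `view(-1)` infers the flattened length and is more robust than hard-coding it:

```python
import torch

# Hypothetical shapes: batch_size=10, seq_len=512 -> 5120 elements total,
# matching the view(5120) call quoted in the comment above.
predictions = torch.zeros(10, 512, dtype=torch.long)
labels = torch.zeros(10, 512, dtype=torch.long)

# Metrics typically expect 1-D sequences of predictions and references,
# so flatten the (batch, seq_len) tensors before passing them in.
# view(-1) infers the length instead of hard-coding 5120.
flat_predictions = predictions.view(-1)
flat_references = labels.view(-1)

print(tuple(flat_predictions.shape))  # (5120,)
print(tuple(flat_references.shape))   # (5120,)
```

With the tensors flattened to the same 1-D shape, they can be passed to a token-level metric without the shape mismatch that triggers the original error.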
Closing this as it seems resolved.
Could you elaborate on the flattening, @SkanderHellal? I get the same error in my setup and don't understand why it fails, since training works with the same dataset preparation. Where can I read more about this?