bytedance / lightseq

LightSeq: A High Performance Library for Sequence Processing and Generation

How to resolve xlm-roberta conversion failure

520jefferson opened this issue · comments

You are using a model of type xlm-roberta to instantiate a model of type bert. This is not supported for all configurations of models and can yield errors.
[screenshot of the conversion warning]
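For context, this warning comes from Hugging Face transformers and means the xlm-roberta checkpoint is being loaded through a BERT model/config class (which is presumably what the LightSeq BERT export script does). A minimal sketch of what triggers it versus what avoids it, assuming the standard transformers API and using xlm-roberta-base as a placeholder checkpoint:

```python
from transformers import BertModel, XLMRobertaModel

# Loading an xlm-roberta checkpoint through the BERT class logs the warning
# "You are using a model of type xlm-roberta to instantiate a model of type bert."
bert_style = BertModel.from_pretrained("xlm-roberta-base")

# Loading it through the matching class does not.
xlmr = XLMRobertaModel.from_pretrained("xlm-roberta-base")
```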
When I run the conversion anyway and then test, I get:
====================START warmup====================
=========lightseq=========
lightseq generating...
type(inputs): <class 'torch.Tensor'> <class 'torch.Tensor'>
Traceback (most recent call last):
  File "test/ls_bert.py", line 115, in <module>
    main()
  File "test/ls_bert.py", line 95, in main
    warmup(tokenizer, ls_model, hf_model, sentences)
  File "test/ls_bert.py", line 51, in warmup
    ls_generate(ls_model, inputs_id, attn_mask)
  File "test/ls_bert.py", line 31, in ls_generate
    ls_output, ls_time = ls_bert(model, inputs_id, attn_mask)
  File "test/ls_bert.py", line 13, in ls_bert
    ls_output = model.infer(inputs, attn_mask)
  File "test/ls_bert.py", line 63, in infer
    last_hidden_states = self.ls_bert.infer(inputs.numpy())
TypeError: infer(): incompatible function arguments. The following argument types are supported:
    1. (self: lightseq.inference.Bert, input_seq: numpy.ndarray[int32], attn_mask: numpy.ndarray[int32]) -> numpy.ndarray[float32]
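The TypeError at the end is separate from the model-type warning: the compiled lightseq.inference.Bert.infer binding expects two int32 numpy arrays (input ids and attention mask), but the wrapper in test/ls_bert.py passes only the input ids. A minimal sketch of an infer() wrapper that matches the advertised signature (variable names are illustrative; inputs and attn_mask are the torch tensors produced by the tokenizer):

```python
import numpy as np

def infer(self, inputs, attn_mask):
    # The binding advertises:
    #   infer(input_seq: ndarray[int32], attn_mask: ndarray[int32]) -> ndarray[float32]
    # so pass both tensors, converted to int32 numpy arrays.
    input_ids = inputs.numpy().astype(np.int32)
    mask = attn_mask.numpy().astype(np.int32)
    return self.ls_bert.infer(input_ids, mask)
```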