yitu-opensource / ConvBert


UnboundLocalError: local variable 'seq_length' referenced before assignment

zhuzihan728 opened this issue

Hi, I am using the ConvBertForTokenClassification model from transformers and hit this bug when passing only inputs_embeds (no input_ids) to forward().
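A minimal reproduction looks roughly like this (the checkpoint name and shapes are placeholders; any ConvBERT checkpoint should behave the same):

import torch
from transformers import ConvBertForTokenClassification

model = ConvBertForTokenClassification.from_pretrained("YituTech/conv-bert-base")

batch_size, seq_length = 2, 16
# inputs_embeds are word embeddings, so the last dim must match embedding_size
inputs_embeds = torch.randn(batch_size, seq_length, model.config.embedding_size)

# passing only inputs_embeds (no input_ids, no token_type_ids) raises
# UnboundLocalError: local variable 'seq_length' referenced before assignment
outputs = model(inputs_embeds=inputs_embeds)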
The traceback points to line 833 of modeling_convbert.py:

if token_type_ids is None:
    if hasattr(self.embeddings, "token_type_ids"):
        buffered_token_type_ids = self.embeddings.token_type_ids[:, :seq_length]

Here seq_length is referenced before it has ever been assigned.

Just above this code, the input shape is computed as:

elif input_ids is not None:
    input_shape = input_ids.size()
    batch_size, seq_length = input_shape
elif inputs_embeds is not None:
    input_shape = inputs_embeds.size()[:-1]

so seq_length is never assigned when execution takes the elif inputs_embeds is not None branch.

I am not sure whether the batch_size, seq_length = input_shape unpacking is simply missing from the inputs_embeds branch, or whether I am not using the model correctly.
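If it is the former, this is the fix I would expect, mirroring the input_ids branch (just a sketch, not a tested patch):

elif inputs_embeds is not None:
    input_shape = inputs_embeds.size()[:-1]
    batch_size, seq_length = input_shape  # define seq_length here as well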