Problem with packed sequence
sebamenabar opened this issue
Seba commented
Hello, I have a problem when using a packed sequence with RNNs; I get the following error:
/usr/local/lib/python3.7/site-packages/torch/nn/modules/module.py in __call__(self, *input, **kwargs)
493 result = self.forward(*input, **kwargs)
494 for hook in self._forward_hooks.values():
--> 495 hook_result = hook(self, input, result)
496 if hook_result is not None:
497 raise RuntimeError(
/usr/local/lib/python3.7/site-packages/torchsummaryX/torchsummaryX.py in hook(module, inputs, outputs)
27 info["id"] = id(module)
28 if isinstance(outputs, (list, tuple)):
---> 29 info["out"] = list(outputs[0].size())
30 else:
31 info["out"] = list(outputs.size())
AttributeError: 'PackedSequence' object has no attribute 'size'
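The traceback above fails because an LSTM fed a packed input returns a PackedSequence, which has no .size() method; its flattened tensor lives in the .data attribute. A hedged sketch (not the library's actual fix, just an illustration) of how a shape-recording hook could guard against this:

```python
import torch
from torch.nn.utils.rnn import PackedSequence

def output_shape(outputs):
    """Return the shape of a module output, tolerating PackedSequence.

    PackedSequence is a tuple subclass, so it must be checked before the
    generic tuple/list case to avoid indexing into it.
    """
    if isinstance(outputs, PackedSequence):
        return list(outputs.data.size())
    if isinstance(outputs, (list, tuple)):
        return output_shape(outputs[0])
    return list(outputs.size())

# 7 packed timesteps of 4 features (batch sizes 3+2+2 = 7)
packed = PackedSequence(torch.zeros(7, 4), torch.tensor([3, 2, 2]))
print(output_shape(packed))                 # [7, 4]
print(output_shape(torch.zeros(2, 3)))      # [2, 3]
```

An LSTM output like (packed_out, (h_n, c_n)) would recurse into the first element and read its .data shape instead of raising AttributeError.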
It was generated with something like this:
encoder = nn.LSTM(..., batch_first=True, bidirectional=True)
question = nn.utils.rnn.pack_padded_sequence(question, question_len, batch_first=True)
contextual_words, (question_embedding, _) = encoder(question)
The error arises here:
--> 244 contextual_words, (question_embedding, _) = encoder(question)
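A minimal self-contained reproduction of this pattern, with hypothetical sizes (input_size=8, hidden_size=16, batch of 3): unpacking the output with pad_packed_sequence restores a plain tensor that does support .size(), which is one way to work around hooks that can't handle PackedSequence.

```python
import torch
import torch.nn as nn

# Hypothetical dimensions for illustration
encoder = nn.LSTM(input_size=8, hidden_size=16,
                  batch_first=True, bidirectional=True)

batch = torch.randn(3, 5, 8)        # (batch, max_len, features)
lengths = torch.tensor([5, 4, 2])   # true lengths, sorted descending

packed = nn.utils.rnn.pack_padded_sequence(batch, lengths, batch_first=True)
packed_out, (h_n, c_n) = encoder(packed)   # packed_out is a PackedSequence

# Unpack back to a padded tensor so .size() works downstream.
out, out_lengths = nn.utils.rnn.pad_packed_sequence(packed_out, batch_first=True)
print(list(out.size()))  # [3, 5, 32] -- hidden size doubled by bidirectional
```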
Namhyuk Ahn commented
@sebamenabar Check the latest version (1.3.0).
Seba commented
Thanks, it works correctly for me now.