facebookresearch / detr

End-to-End Object Detection with Transformers


Convert model to TorchScript

yingh16 opened this issue · comments


Hello. I am trying to convert the model to TorchScript. However, when I run:

import torch
model = torch.hub.load('facebookresearch/detr', 'detr_resnet50', pretrained=True)
model = torch.jit.script(model)

I get the error shown below:

RuntimeError:
Return value was annotated as having type __torch__.util.misc.___torch_mangle_916.NestedTensor (of Python compilation unit at: 0000022C7BC38970) but is actually of type __torch__.util.misc.NestedTensor (of Python compilation unit at: 0000022C7BC38970):
  File "C:\Users\xx/.cache\torch\hub\facebookresearch_detr_main\util\misc.py", line 298
        else:
            cast_mask = None
        return NestedTensor(cast_tensor, cast_mask)
               ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ <--- HERE
'NestedTensor.to' is being compiled since it was called from '__torch__.util.misc.NestedTensor'
  File "C:\Users\xx/.cache\torch\hub\facebookresearch_detr_main\models\backbone.py", line 72
    def forward(self, tensor_list: NestedTensor):
                                   ~~~~~~~~~~~~ <--- HERE
        xs = self.body(tensor_list.tensors)
        # out = OrderedDict()
'__torch__.util.misc.NestedTensor' is being compiled since it was called from 'Backbone.forward'
  File "C:\Users\xx/.cache\torch\hub\facebookresearch_detr_main\models\backbone.py", line 72
    def forward(self, tensor_list: NestedTensor):
    ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
        xs = self.body(tensor_list.tensors)
        ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
        # out = OrderedDict()
        ~~~~~~~~~~~~~~~~~~~~~
        out: Dict[str, NestedTensor] = {}
        ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
        for name, x in xs.items():
        ~~~~~~~~~~~~~~~~~~~~~~~~~~
            m = tensor_list.mask
            ~~~~~~~~~~~~~~~~~~~~
            assert m is not None
            ~~~~~~~~~~~~~~~~~~~~
            mask = F.interpolate(m[None].float(), size=x.shape[-2:]).to(torch.bool)[0]
            ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
            out[name] = NestedTensor(x, mask)
            ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
        return out
        ~~~~~~~~~~ <--- HERE

I am not sure what I should do. Any suggestions? Thanks in advance!
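In case it helps frame the question: if scripting turns out not to be supported, a tracing-based export is what I would fall back to, roughly along these lines. This is only a rough, untested sketch; the input resolution and output filename are arbitrary placeholders.

import torch

# Load the pretrained DETR-R50 model from the hub, same call as above.
model = torch.hub.load('facebookresearch/detr', 'detr_resnet50', pretrained=True)
model.eval()

# Dummy batched image tensor; the 800x1200 resolution is just an example.
dummy_input = torch.rand(1, 3, 800, 1200)

# Tracing records the ops executed for this concrete input instead of compiling
# the Python source, so the NestedTensor annotations are never inspected.
# strict=False is passed because the model returns a dict of tensors.
traced = torch.jit.trace(model, dummy_input, strict=False)
traced.save('detr_resnet50_traced.pt')

That said, I would still prefer torch.jit.script if it can be made to work, since tracing only records the path taken for this one example input.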