microsoft / DynamicHead

Defining Spatial-aware Attention Layer error

Aliweka2020 opened this issue · comments

When running this line of code:

```python
spatial_output = spatial_layer(scale_output)
```

I got this error:

```
~\AppData\Local\Continuum\anaconda3\lib\site-packages\torch\nn\modules\module.py in _call_impl(self, *input, **kwargs)
    725             result = self._slow_forward(*input, **kwargs)
    726         else:
--> 727             result = self.forward(*input, **kwargs)
    728         for hook in itertools.chain(
    729                 _global_forward_hooks.values(),

TypeError: forward() takes 3 positional arguments but 4 were given
```
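For context, this `TypeError` arises because `nn.Module.__call__` passes all positional arguments through to `forward()`, so calling the layer with more tensors than its `forward()` signature accepts fails at that point. Below is a minimal, torch-free sketch of the same mechanism (the class and argument names are hypothetical, not the DynamicHead implementation):

```python
# Minimal stand-in for how nn.Module.__call__ delegates to forward():
# extra positional arguments surface as the TypeError seen in the traceback.
class Module:
    def __call__(self, *args, **kwargs):
        return self.forward(*args, **kwargs)

class SpatialAwareAttention(Module):
    # forward() takes self plus two tensors = 3 positional arguments total
    def forward(self, feature, offset):
        return feature

layer = SpatialAwareAttention()
try:
    layer(1, 2, 3)  # self + 3 args = 4 positional arguments -> TypeError
except TypeError as e:
    print(e)  # "... takes 3 positional arguments but 4 were given"
```

In other words, the caller and the layer's `forward()` signature disagree on the number of inputs; a version mismatch in the library can produce exactly this kind of disagreement.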

Thanks for your interest! The code is released now. Please check it out.

Upgrading the torch version from 1.7.0 to 1.10.0 solved the problem.