cfzd / FcaNet

FcaNet: Frequency Channel Attention Networks

dct_h and dct_w

myasser63 opened this issue · comments

How can I set dct_h and dct_w if I want to add the FCA layer to another model? The feature maps at the layers where I want to insert the FCA layer are 160x160, 80x80, 40x40, and 20x20.

Please advise.

commented

@myasser63
You can add the FCA layer directly, without any modification. The feature map's size is handled automatically here:

FcaNet/model/layer.py

Lines 54 to 55 in aa5fb63

if h != self.dct_h or w != self.dct_w:
x_pooled = torch.nn.functional.adaptive_avg_pool2d(x, (self.dct_h, self.dct_w))
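
For example, a minimal sketch (the import path follows this repo's layout; the channel count and sizes are just for illustration) showing that the input's spatial size need not match dct_h and dct_w:

import torch
from model.layer import MultiSpectralAttentionLayer

# dct_h = dct_w = 7, but the input is 20x20: the layer pools the
# feature map down to 7x7 internally before applying the DCT.
att = MultiSpectralAttentionLayer(512, 7, 7)
x = torch.randn(2, 512, 20, 20)
y = att(x)  # attention-weighted output, same shape: (2, 512, 20, 20)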

commented

So should I leave dct_h and dct_w as they are, or set them to the feature map sizes?

self.FCA = MultiSpectralAttentionLayer(in_channels, self.dct_h, self.dct_w)

commented

Whatever you prefer. You can set them according to your own needs, or use the same settings we do:

c2wh = dict([(64, 56), (128, 28), (256, 14), (512, 7)])

self.att = MultiSpectralAttentionLayer(planes * 4, c2wh[planes], c2wh[planes], reduction=reduction, freq_sel_method='top16')
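
Putting it together, a sketch of inserting the layer into another model with those settings (the wrapper block and the channel width c1 are hypothetical; the import path follows this repo's layout):

import torch
import torch.nn as nn
from model.layer import MultiSpectralAttentionLayer

class BlockWithFCA(nn.Module):
    # Toy block: a conv followed by frequency channel attention.
    def __init__(self, c1):
        super().__init__()
        # channel -> (dct_h, dct_w) mapping the authors use for ResNet
        c2wh = dict([(64, 56), (128, 28), (256, 14), (512, 7)])
        self.conv = nn.Conv2d(c1, c1, 3, padding=1)
        self.att = MultiSpectralAttentionLayer(c1, c2wh[c1], c2wh[c1],
                                               reduction=16,
                                               freq_sel_method='top16')

    def forward(self, x):
        return self.att(self.conv(x))

block = BlockWithFCA(64)
out = block(torch.randn(1, 64, 160, 160))  # 160x160 is pooled to 56x56 internally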

commented

I am trying it this way and getting this error:

self.FCA = MultiSpectralAttentionLayer(c1, c2wh[c1], c2wh[c1])

Error: RuntimeError: adaptive_avg_pool2d_backward_cuda does not have a deterministic implementation, but you set 'torch.use_deterministic_algorithms(True)'. You can turn off determinism just for this operation, or you can use the 'warn_only=True' option, if that's acceptable for your application. You can also file an issue at https://github.com/pytorch/pytorch/issues

commented

@myasser63
As the error says, it's a problem with adaptive_avg_pool2d, whose backward pass has no deterministic CUDA implementation. You can either downgrade the error to a warning with:

torch.use_deterministic_algorithms(True, warn_only=True)

or turn off the determinism check entirely with:

torch.use_deterministic_algorithms(False)
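
For example, a minimal placement sketch (putting the call at the top of the training script is an assumption; it only needs to run before the backward pass):

import torch

# Option 1: keep the determinism check, but only warn for ops such as
# adaptive_avg_pool2d's backward that have no deterministic kernel.
torch.use_deterministic_algorithms(True, warn_only=True)

# Option 2: disable the determinism check entirely.
# torch.use_deterministic_algorithms(False)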