joe-siyuan-qiao / DetectoRS

DetectoRS: Detecting Objects with Recursive Feature Pyramid and Switchable Atrous Convolution

About ASPP Module

mhyeonsoo opened this issue · comments

Hi,

Thanks for the great code.
I am now trying to use the code with my own dataset, and since I need TensorFlow as the framework, I am porting the code to TF 2.1.

It may be a bit awkward to ask here, but I think this is a question about the module mechanism itself.

In the ASPP module class, I saw that the ReLU layer comes after the global average pooling.
When I ran my port, it complained that the output dimensions of the '(aspp_idx == self.aspp_num - 1)' branch and the other branches did not match.
The output shape of the other branches was something like (None, 128, 128, 64),
while the output of the '(aspp_idx == self.aspp_num - 1)' branch had a shape like (None, 256).
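For reference, here is a minimal sketch of what I mean in my TF 2.1 port; the layer names and channel counts are just placeholders from my own code, not the original implementation:

```python
import tensorflow as tf

# Minimal sketch of the two branch types in my TF 2.1 port (placeholder names).
x = tf.zeros((1, 128, 128, 256))                   # dummy input feature map

conv = tf.keras.layers.Conv2D(64, 3, padding='same', dilation_rate=3)
gap = tf.keras.layers.GlobalAveragePooling2D()

# regular ASPP branch: atrous conv followed by ReLU, spatial dims preserved
branch = tf.nn.relu(conv(x))                       # shape (1, 128, 128, 64)

# last branch ('aspp_idx == self.aspp_num - 1'): global average pooling first
pooled = gap(x)                                    # shape (1, 256); spatial dims are gone

print(branch.shape, pooled.shape)
# the two outputs cannot be combined directly because one is 4-D and the other 2-D
```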

Is there any reason for applying GAP before the ReLU?
And if so, how can I fix the error caused by the dimension mismatch?

Thanks a lot!

Hi, the ASPP module was proposed by DeepLab, which includes a global average pooling layer. Please check the TF implementation you have; it seems the number of channels is not set correctly for the GAP branch.
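For example, one way to keep the branches compatible in TF 2.x is to keep the pooled branch 4-D and broadcast it back to the input's spatial size before concatenation. This is only a sketch with illustrative names and default channel sizes, not the code in this repository or your port:

```python
import tensorflow as tf

def aspp_block(x, out_channels=256, dilations=(1, 3, 6)):
    """Illustrative ASPP block in TF 2.x (not the code in this repository).

    Every branch outputs `out_channels` channels; the image-level (GAP) branch
    stays 4-D and is broadcast back to the input's spatial size so that all
    branch outputs can be concatenated.
    """
    branches = []
    for d in dilations:
        conv = tf.keras.layers.Conv2D(
            out_channels, 3 if d > 1 else 1, padding='same', dilation_rate=d)
        branches.append(tf.nn.relu(conv(x)))

    # image-level branch: average over the spatial dims but keep them,
    # so the tensor stays (N, 1, 1, C_in) instead of (N, C_in)
    pooled = tf.reduce_mean(x, axis=[1, 2], keepdims=True)
    pooled = tf.nn.relu(tf.keras.layers.Conv2D(out_channels, 1)(pooled))
    # broadcast back to the spatial size of the other branches before concat
    pooled = tf.broadcast_to(pooled, tf.shape(branches[0]))

    return tf.concat(branches + [pooled], axis=-1)

# example: a (1, 128, 128, 256) input gives a (1, 128, 128, 4 * 256) output
y = aspp_block(tf.zeros((1, 128, 128, 256)))
print(y.shape)
```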