exiawsh / StreamPETR

[ICCV 2023] StreamPETR: Exploring Object-Centric Temporal Modeling for Efficient Multi-View 3D Object Detection


Question about the shape of the norm operation

czy341181 opened this issue · comments

Hi, thanks for your great work!

I have a question about the shape of the norm operation:

```python
query = self.norms[norm_index](query)
```

The input of the norm layer is `[num_query, bs, embed_dim]`. Is this a bug?

I found that different input shapes can lead to differences in the calculated results. Do you know why?

@czy341181 Thanks for your attention. What do you think is the right operation?

Do you mean it should be changed to:

```python
elif layer == 'norm':
    query = query.transpose(0, 1)
    query = self.norms[norm_index](query)
    query = query.transpose(0, 1)
    norm_index += 1
```

The results are normal. We use `batch_first=True` in multi-head attention by default.

Should the results calculated by the two be equal?

The mAP result is the same as the original implementation.

What I mean is that, in this operation, different input shapes can lead to different results.

You changed the order of the dimensions, so it is expected that the raw output tensor differs, simply because the dimension order is not the same. But layer norm only operates on the last dimension, so only the ordering of elements changes; the numerical values themselves are the same.
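The point above can be checked with a small sketch: a layer norm that normalizes only over the last dimension produces the same per-element values whether the input is `[num_query, bs, embed_dim]` or `[bs, num_query, embed_dim]`. This is a minimal numpy re-implementation of last-dim layer norm (without the learnable affine parameters), not the actual StreamPETR code:

```python
import numpy as np

def layer_norm(x, eps=1e-5):
    # Normalize over the last dimension only, mirroring nn.LayerNorm(embed_dim).
    mean = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    return (x - mean) / np.sqrt(var + eps)

rng = np.random.default_rng(0)
x = rng.standard_normal((5, 2, 8))  # [num_query, bs, embed_dim]

# Normalize directly, and normalize after swapping the first two dims.
out_a = layer_norm(x)
out_b = layer_norm(x.transpose(1, 0, 2)).transpose(1, 0, 2)

print(np.allclose(out_a, out_b))  # True: swapping num_query and bs changes nothing
```

Only the layout of the tensor differs between the two calls; after transposing back, the values match, which is why the mAP is unchanged.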

@czy341181 If you still have questions, feel free to reopen the issue.