Possibly redundant BatchNorm layer?
zhikaili opened this issue
@xptree Thank you for the code. It seems there are two cascaded BatchNorm layers in each GIN layer, and I am wondering whether one of them is redundant.
Specifically,
In the UnsupervisedGIN class, BNs are instantiated (with use_selayer=False, as in the code):
and called during forward:
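For readers without the screenshot, the relevant part of UnsupervisedGIN presumably mirrors DGL's GIN example, which keeps one BatchNorm per layer in a ModuleList and applies it in forward. A minimal sketch (class and attribute names here are assumptions, not the actual GCC code):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class UnsupervisedGINSketch(nn.Module):
    """Sketch of the per-layer BatchNorm in UnsupervisedGIN (names assumed)."""

    def __init__(self, num_layers=3, hidden_dim=16):
        super().__init__()
        # One BatchNorm per GIN layer, as in DGL's GIN example.
        self.batch_norms = nn.ModuleList(
            nn.BatchNorm1d(hidden_dim) for _ in range(num_layers - 1)
        )

    def forward(self, h):
        for bn in self.batch_norms:
            # The GIN conv (omitted here) would run first; then this
            # outer BatchNorm, then ReLU.
            h = bn(h)
            h = F.relu(h)
        return h
```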
Meanwhile,
In the ApplyNodeFunc class, a BN is instantiated again and called during forward (with use_selayer=False, as in the code):
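In DGL's GIN example, which GCC appears to follow, ApplyNodeFunc applies its own BatchNorm after the MLP. A sketch, with the real MLP replaced by a plain nn.Linear for brevity (dimensions are illustrative):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ApplyNodeFuncSketch(nn.Module):
    """Update node features with MLP -> BN -> ReLU, as in DGL's GIN example."""

    def __init__(self, in_dim=16, out_dim=16):
        super().__init__()
        self.mlp = nn.Linear(in_dim, out_dim)  # stand-in for the real MLP
        self.bn = nn.BatchNorm1d(out_dim)      # the inner BatchNorm of the pair

    def forward(self, h):
        h = self.mlp(h)
        h = self.bn(h)
        h = F.relu(h)
        return h
```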
So in each layer there are two cascaded BNs, separated only by a ReLU activation.
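Composing the two pieces, each layer's pipeline would look like the following (a hypothetical stand-in with illustrative dimensions, not the actual GCC code):

```python
import torch
import torch.nn as nn

# Hypothetical per-layer pipeline showing the cascade described above:
# MLP -> BN -> ReLU -> BN -> ReLU.
layer = nn.Sequential(
    nn.Linear(16, 16),   # stand-in for the GIN MLP
    nn.BatchNorm1d(16),  # BN inside ApplyNodeFunc
    nn.ReLU(),
    nn.BatchNorm1d(16),  # BN in UnsupervisedGIN's forward
    nn.ReLU(),
)

layer.eval()
out = layer(torch.randn(8, 16))
print(out.shape)  # torch.Size([8, 16])
```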
As a novice in GNNs, I have not seen such an implementation (cascaded BNs) elsewhere. Could you please explain why you did this? Does it lead to better performance than keeping only one BN per layer?
Thank you!
Hi @zhikaili ,
Thanks for pointing this out. This pattern comes from the reference GIN implementation, which we left unchanged; it is nothing specific to GCC. Please see https://github.com/dmlc/dgl/blob/master/examples/pytorch/gin/gin.py.