huawei-noah / AdderNet

Code for the paper "AdderNet: Do We Really Need Multiplications in Deep Learning?"

Negative Sign in L1 Distance Equation

alarst13 opened this issue · comments

[Screenshot of the L1-distance output equation from the paper: Y(m, n, t) = -∑_i ∑_j ∑_k |X(m+i, n+j, k) - F(i, j, k, t)|]

Hi! I wonder where this negative sign comes from?

It is introduced to ensure that if X and F are more similar, the output Y will be larger, which is the same behavior as in a CNN.

I don't get it. Adding a negative sign just flips the sign; it makes the value even smaller!

Sorry for the confusion. Since the output carries a bias, we only pay attention to the 'relative' values for different inputs. For example, if F is more similar to X1 than to X2, then the output Y1 should be larger than Y2.
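To make the ordering argument concrete, here is a minimal NumPy sketch (the filter F, the patches X1 and X2, and the helper adder_response are made-up for illustration, not code from this repo). The patch that is closer to the filter yields the larger, i.e. less negative, output:

```python
import numpy as np

# Negated L1 distance, Y = -sum(|X - F|): a better match gives a larger
# (less negative) response, mirroring the "larger output for a better
# match" behavior of a multiplication-based convolution.

F  = np.array([1.0, 2.0, 3.0])   # filter weights (illustrative)
X1 = np.array([1.1, 2.0, 2.9])   # patch similar to F
X2 = np.array([4.0, 0.0, 7.0])   # patch dissimilar to F

def adder_response(X, F):
    # Negated L1 distance between an input patch and the filter
    return -np.abs(X - F).sum()

Y1 = adder_response(X1, F)   # -0.2
Y2 = adder_response(X2, F)   # -9.0

print(Y1 > Y2)   # True: the more similar patch scores higher
```

Both outputs are negative, but only their relative order matters here, which is the point of the explanation above.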

It's clear to me now. Thank you!

Thanks! I'm closing the issue.