It seems that the loss of fcdd differs from the paper?
captainfffsama opened this issue · comments
As you can see here, the conv layers have bias by default. Thus, the final conv layer of FCDD_CNN224_VGG also has a bias. This holds for all networks. Besides, by default, bias is enabled in all intermediate layers as well (see here), since if one uses some sort of anomalies (be they artificial or not), weight collapse to zero is no longer an issue.
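A minimal sketch of why the bias makes the choice of center harmless (this is a hypothetical illustration, not code from the fcdd repo): a layer with a bias term can absorb any constant center c, so the squared distance to c equals the squared distance to 0 after shifting the bias by c.

```python
import numpy as np

# Hypothetical toy model: a 1x1 "conv" is just a linear map W @ x + b.
rng = np.random.default_rng(0)
W = rng.standard_normal((1, 8))   # weights of the final 1x1 conv
b = rng.standard_normal(1)        # bias (enabled by default)
x = rng.standard_normal((8, 16))  # 8 input channels, 16 spatial positions

c = 1.0
out = W @ x + b[:, None]

# Shifting the bias by c reproduces the distance to center c exactly,
# so the network can trivially re-center its output for any constant c.
out_shifted = W @ x + (b - c)[:, None]
assert np.allclose((out - c) ** 2, out_shifted ** 2)
print("distance to c == distance to 0 after bias shift")
```

This is also why, without anomalies in training, a biased network could collapse all outputs onto the center; with anomalies in the objective, that trivial solution is penalized and the bias is safe to keep.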
Thanks for your answer and excellent work. Since, as you say, weight collapse to zero is no longer an issue, does the objective function still work if we set c
to any sensible value, like c=1
?
OK, I just tried setting c=1
and ran training on my custom dataset. The results seem close to those with c=0
... So maybe we can set c
to any sensible value and the network still works.