Number of parameters not reduced
zhaokai5 opened this issue · comments
zhaokai5 commented
I used your code to train my network, and I found that the dw layer has the same number of parameters as a normal convolution. Do you change the weight blob size when creating the DepthwiseConvolution layer?
刘灏@megvii.com commented
No. The dw layer's weight blob has the same size as an original conv layer whose `group` is set to the channel count. This means you must set `group` when using my dw layer, exactly as you would for a normal grouped convolution, and you can then load the original caffemodel with no compatibility cost.
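The answer hinges on how Caffe shapes a convolution's weight blob: `group` divides the input-channel dimension, so a layer with `group` equal to its channel count stores far fewer parameters. A minimal sketch of that arithmetic (the helper functions are illustrative, not part of the repo):

```python
# Hypothetical helpers (not from this repo): compute the weight-blob shape
# Caffe gives a convolution layer, to show why `group` must be set for dw.
def conv_weight_shape(num_output, channels, kernel, group=1):
    """Caffe convolution weight blob: (num_output, channels/group, k, k)."""
    assert channels % group == 0 and num_output % group == 0
    return (num_output, channels // group, kernel, kernel)

def param_count(shape):
    n = 1
    for d in shape:
        n *= d
    return n

# 64-channel 3x3 layer with group left at 1: full-size weight blob.
normal = conv_weight_shape(64, 64, 3)               # (64, 64, 3, 3)
# Same layer with group = channels = 64: depthwise, one filter per channel.
depthwise = conv_weight_shape(64, 64, 3, group=64)  # (64, 1, 3, 3)
print(param_count(normal), param_count(depthwise))  # 36864 576
```

With `group: 1` the blob stays at 36864 parameters, which matches the behavior reported above; setting `group` to the channel count shrinks it to 576.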