yonghenglh6 / DepthwiseConvolution

A personal depthwise convolution layer implementation on caffe by liuhao.(only GPU)

Number of parameters not reduced

zhaokai5 opened this issue · comments

I used your code to train my net, and I found that the depthwise layer's parameter count is the same as a normal convolution's. Do you change the weight blob size when creating the DepthwiseConvolution layer?
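For reference, a minimal sketch of how Caffe sizes a convolution weight blob, `(num_output, channels / group, k, k)`, shows why a depthwise layer (group equal to the channel count) should hold far fewer parameters than a dense conv. The helper name and the channel count 32 are illustrative assumptions, not from this repo:

```python
def conv_params(c_in, c_out, k, group=1):
    # Caffe weight blob shape: (c_out, c_in / group, k, k)
    assert c_in % group == 0 and c_out % group == 0
    return c_out * (c_in // group) * k * k

# Dense 3x3 conv, 32 -> 32 channels
dense = conv_params(32, 32, 3)          # 32 * 32 * 3 * 3 = 9216
# Depthwise 3x3 conv: group == channel count
depthwise = conv_params(32, 32, 3, group=32)  # 32 * 1 * 3 * 3 = 288
print(dense, depthwise)
```

If the trained model's depthwise layers show the dense count, the likely cause is that `group` was left at its default of 1 in the prototxt.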

No. The depthwise layer's parameter blob is the same size as an original conv layer's with group set to the channel count. That means you must set the group number when using my depthwise layer, just as you would for a normal grouped convolution; in return, you can load an original caffemodel with no compatibility cost.
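Under that setup, the layer definition would look like the grouped form below. This is a sketch using standard Caffe `ConvolutionParameter` fields; the layer type string and the channel count 32 are assumptions for illustration:

```
layer {
  name: "conv_dw"
  type: "DepthwiseConvolution"  # this repo's layer type (assumed name)
  bottom: "data"
  top: "conv_dw"
  convolution_param {
    num_output: 32   # must equal the input channel count
    group: 32        # set group = channel count, as for a grouped conv
    kernel_size: 3
    pad: 1
    stride: 1
  }
}
```

Because the weight blob shape matches a grouped `Convolution` layer with the same `group`, weights trained either way stay interchangeable.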