Confusion about formula for H2 and W2 and D2
ArEnSc opened this issue · comments
In your code you specify that
layer_defs.push({type:'input', out_sx:32, out_sy:32, out_depth:3}); // declare size of input
// output Vol is of size 32x32x3 here
This is fine and makes sense
However, this volume calculation is odd to me:
layer_defs.push({type:'conv', sx:5, filters:16, stride:1, pad:2, activation:'relu'});
// the layer will perform convolution with 16 kernels, each of size 5x5.
// the input will be padded with 2 pixels on all sides to make the output Vol of the same size
// output Vol will thus be 32x32x16 at this point
In your lecture notes you say
H2 = (H1 - F) + (2 * P) / S + 1
H2 = (32 - 5) + (2 * 2)/ 1 + 1
H2 = (27) + (4) / 2
H2 = 15.5
W2 = (W1 - F) + (2 * P) / S + 1
W2 = 15.5
D2 = K
D2 = 16
Resulting Volume
15.5 * 15.5 * 16
I am confused about how you arrived at 32 x 32 x 16 volume
H2 = (32 - 5) + (2 * 2)/ 1 + 1 = 32 - 5 + 4 + 1 = 32, not 15.5. This is
elementary school stuff ;)
Also make sure you're not missing brackets; the formula is
H2 = ((H1 - F) + (2 * P)) / S + 1
On Sat, Aug 15, 2015 at 1:08 AM, Michael Chung notifications@github.com
wrote:
[quoted issue text]
LOL OMg I need sleep thanks hahahaha