titu1994 / Keras-ResNeXt

Implementation of ResNeXt models from the paper Aggregated Residual Transformations for Deep Neural Networks in Keras 2.0+.


Hello

greg2paris opened this issue · comments

First I want to thank you for your implementation of Squeeze-and-Excitation Networks in keras.
I tried your network and it worked very well, but I have a question.
As I understand it, ResNet is very similar to ResNeXt.
ResNet reduces the spatial size of the input 5 times, for example:
256 -> 128, 128 -> 64, 64 -> 32, 32 -> 16, 16 -> 8.
But when I used the code you provided to create a ResNeXt, I end up with only 3 size reductions:
256 -> 128, 128 -> 64, 64 -> 32.
Is this normal?
Does this affect the accuracy of the network?
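To make the difference concrete, here is a quick arithmetic sketch (plain Python, not code from this repo) of the spatial sizes produced by 5 versus 3 stride-2 reductions starting from a 256-pixel input:

```python
def sizes(start, reductions):
    """Spatial size after `reductions` stride-2 downsamplings."""
    out = [start]
    for _ in range(reductions):
        out.append(out[-1] // 2)  # each stride-2 stage halves the size
    return out

print(sizes(256, 5))  # [256, 128, 64, 32, 16, 8]  -- ImageNet-style ResNet
print(sizes(256, 3))  # [256, 128, 64, 32]         -- CIFAR-style ResNeXt
```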

I used these inputs to create the network:

```python
resnet_base = SEResNext(input_shape=input_shape,
                        # depth=155,
                        depth=56,
                        cardinality=64,
                        width=4,
                        weight_decay=5e-4,
                        include_top=False,
                        weights=None,
                        input_tensor=input_layer,
                        pooling=None)
```

This question is more suited to the SE repository, but it makes no difference where I answer it.

The fact is that the ResNeXt implementation is a placeholder while we wait for grouped convolutions in Keras/TensorFlow.

Therefore the ResNeXt model I wrote is not based on the ImageNet version with 5 pooling layers, but on the CIFAR-10 version with 3 pooling layers.

However, I recommend that you not use my ResNeXt implementation and instead stick to the PyTorch or Torch versions, as they properly support grouped convolutions.
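For context, a grouped convolution can be emulated by splitting the input channels into `cardinality` groups, convolving each group independently, and concatenating the results, which is roughly the channel-split workaround a placeholder implementation has to fall back on. Below is a minimal NumPy sketch of a grouped 1x1 convolution; the function name and shapes are hypothetical illustrations, not this repo's actual API:

```python
import numpy as np

def grouped_conv1x1(x, weights, cardinality):
    """Emulate a grouped 1x1 convolution by channel splitting.

    x: input feature map of shape (H, W, C_in).
    weights: list of `cardinality` arrays, each of shape
             (C_in // cardinality, C_out // cardinality).
    """
    groups = np.split(x, cardinality, axis=-1)       # split input channels
    outs = [g @ w for g, w in zip(groups, weights)]  # per-group 1x1 conv
    return np.concatenate(outs, axis=-1)             # merge group outputs

# Example: 8 input channels, cardinality 4 -> 4 groups of 2 channels each
x = np.random.rand(32, 32, 8)
w = [np.random.rand(2, 2) for _ in range(4)]
y = grouped_conv1x1(x, w, cardinality=4)
print(y.shape)  # (32, 32, 8)
```

Each group only mixes its own slice of channels, which is exactly what a native grouped convolution does in one fused operation; the split-and-concatenate version is correct but slower than proper framework support.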