BIGBALLON / CIFAR-ZOO

PyTorch implementation of CNNs for CIFAR benchmark

Layer C5 in LeNet

luo3300612 opened this issue

Thanks for your repo!

A detail about LeNet-5:

Layer C5 in LeNet should be a convolutional layer, as LeCun described in his paper:

Layer C5 is a convolutional layer with 120 feature maps.
Each unit is connected to a 5x5 neighborhood on all 16
of S4's feature maps. Here, because the size of S4 is also
5x5, the size of C5's feature maps is 1x1: this amounts
to a full connection between S4 and C5. C5 is labeled
as a convolutional layer, instead of a fully-connected layer
because if LeNet-5 input were made bigger with everything
else kept constant, the feature map dimension would be
larger than 1x1.
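
To see why this "amounts to a full connection" at this input size, here is a minimal sketch (assuming PyTorch; the tensor shapes mirror S4's 16 feature maps of 5x5): a 5x5 convolution over a 5x5 input yields 1x1 outputs, and reusing the same weights in an nn.Linear gives the identical result.

import torch
import torch.nn as nn

# S4 output in LeNet-5: 16 feature maps of size 5x5 (batch of 1 here)
s4 = torch.randn(1, 16, 5, 5)

# C5 as a convolution: 120 maps, 5x5 kernel -> each output map is 1x1
c5_conv = nn.Conv2d(16, 120, kernel_size=5)
out_conv = c5_conv(s4)                                # shape (1, 120, 1, 1)

# The same weights used as a fully-connected layer give the same numbers
c5_fc = nn.Linear(16 * 5 * 5, 120)
c5_fc.weight.data = c5_conv.weight.data.view(120, -1)
c5_fc.bias.data = c5_conv.bias.data
out_fc = c5_fc(s4.view(1, -1))                        # shape (1, 120)

print(torch.allclose(out_conv.view(1, -1), out_fc, atol=1e-6))  # True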

It is nothing serious~

@luo3300612 You are really careful, haha.
Seriously, we do need a convolution with a 5x5 kernel here, according to the original paper. You can try this snippet:

import torch.nn as nn
import torch.nn.functional as F


class LeNet(nn.Module):
    def __init__(self, num_classes=10):
        super(LeNet, self).__init__()
        # C1/C3/C5: all convolutional, with 5x5 kernels as in the paper
        self.conv_1 = nn.Conv2d(3, 6, 5)
        self.conv_2 = nn.Conv2d(6, 16, 5)
        self.conv_3 = nn.Conv2d(16, 120, 5)   # C5: convolutional, not fully connected
        self.fc_1 = nn.Linear(120, 84)
        self.fc_2 = nn.Linear(84, num_classes)

    def forward(self, x):
        out = F.relu(self.conv_1(x))
        out = F.max_pool2d(out, 2)
        out = F.relu(self.conv_2(out))
        out = F.max_pool2d(out, 2)
        out = F.relu(self.conv_3(out))         # 1x1 spatial size on 32x32 input
        out = out.view(out.size(0), -1)
        out = F.relu(self.fc_1(out))
        out = self.fc_2(out)
        return out
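
For a quick sanity check, a usage sketch along these lines should work (the batch size of 4 is just an illustrative assumption; CIFAR images are 3x32x32):

import torch

model = LeNet(num_classes=10)
x = torch.randn(4, 3, 32, 32)   # a batch of CIFAR-sized images
y = model(x)
print(y.shape)                  # torch.Size([4, 10])

With 32x32 inputs, conv_3 sees 5x5 feature maps, so its output is 1x1 per channel; on a larger input those maps would be larger than 1x1, which is exactly why the paper labels C5 as convolutional.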

Of course, as you said, it is nothing serious~