
DenseNet in Keras

A Keras implementation of DenseNet from the paper Densely Connected Convolutional Networks.

Architecture

DenseNet is an extension of Wide Residual Networks. According to the paper:

The lth layer has l inputs, consisting of the feature maps of all preceding convolutional blocks. 
Its own feature maps are passed on to all L − l subsequent layers. This introduces L(L+1) / 2 connections 
in an L-layer network, instead of just L, as in traditional feed-forward architectures. 
Because of its dense connectivity pattern, we refer to our approach as Dense Convolutional Network (DenseNet).

It features several improvements, such as:

  1. Dense connectivity: connecting every layer to all subsequent layers (sketched in code after this list).
  2. Growth rate: a parameter which dictates how quickly the number of feature maps grows as the network gets deeper.
  3. Consecutive functions: the BatchNorm - ReLU - Conv ordering, taken from the Wide ResNet paper as an improvement over the original ResNet design.
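
How these three ideas interact is easiest to see in code. Below is a minimal sketch of a single dense block, assuming the Keras 2 functional API with channels-last ordering; the repository itself targets an older Keras/Theano setup, so its actual code will differ in layer names and arguments.

```python
from keras.layers import Input, Conv2D, BatchNormalization, Activation, Concatenate
from keras.models import Model

def dense_block(x, num_layers, growth_rate):
    """Stack `num_layers` BatchNorm-ReLU-Conv layers with dense connectivity."""
    for _ in range(num_layers):
        # Composite function from the paper: BatchNorm -> ReLU -> Conv
        y = BatchNormalization()(x)
        y = Activation('relu')(y)
        y = Conv2D(growth_rate, (3, 3), padding='same')(y)
        # Dense connectivity: concatenate the new feature maps onto
        # everything produced so far, so every later layer sees them.
        x = Concatenate()([x, y])
    return x

inputs = Input(shape=(32, 32, 3))               # CIFAR-10-sized input
x = Conv2D(16, (3, 3), padding='same')(inputs)  # initial convolution
x = dense_block(x, num_layers=12, growth_rate=12)  # 16 + 12*12 = 160 channels out
model = Model(inputs, x)
```

Note how the channel count grows by exactly `growth_rate` per layer, which is why the growth rate controls how quickly the network widens with depth.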

The overall DenseNet architecture, a sequence of dense blocks joined by transition layers, is illustrated in the paper.

Performance

The accuracy of DenseNet is reported in the paper, beating all previous benchmarks on CIFAR-10, CIFAR-100, and SVHN.

Usage

Weights are provided for the DenseNet-40-12 model (depth 40, growth rate 12) trained on CIFAR-10.

  1. Run the cifar10.py script to train the DenseNet-40 model.
  2. To test classification accuracy instead, comment out the model.fit_generator(...) line and uncomment the model.load_weights("weights/DenseNet-40-12-CIFAR10.h5") line. A sketch of this evaluation path follows below.
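
Step 2 amounts to rebuilding the same architecture and loading the saved weights into it. Here is a minimal sketch, assuming a hypothetical build_densenet_40() helper that reconstructs the exact DenseNet-40-12 model the weights were saved from (in practice, the model defined in cifar10.py), and assuming simple 0-1 scaling as preprocessing; the preprocessing must match whatever cifar10.py uses at training time.

```python
from keras.datasets import cifar10
from keras.utils import to_categorical

# Load and preprocess the CIFAR-10 test set (assumption: 0-1 scaling;
# use the same preprocessing as the training script).
(_, _), (x_test, y_test) = cifar10.load_data()
x_test = x_test.astype('float32') / 255.0
y_test = to_categorical(y_test, 10)

model = build_densenet_40()  # hypothetical: must match the saved architecture
model.compile(optimizer='adam', loss='categorical_crossentropy',
              metrics=['accuracy'])

# Weights file as shipped with this repository
model.load_weights('weights/DenseNet-40-12-CIFAR10.h5')

loss, acc = model.evaluate(x_test, y_test, verbose=0)
print('Test accuracy: %.4f' % acc)
```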

Requirements

  • Keras
  • Theano (tested) / TensorFlow (not tested; weights not provided)
  • h5py
