Week 1: Consider a neural network that takes two inputs, has one hidden layer with two nodes, and an output layer with one node. Start by randomly initializing the weights and the biases in the network, then print the weights and biases.
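A minimal sketch of this initialization, assuming NumPy and the 2–2–1 architecture described above (the seed and rounding are illustrative choices, not requirements):

```python
import numpy as np

np.random.seed(42)  # fix the seed so the "random" values are reproducible

# 2 inputs -> 2 hidden nodes needs 4 weights; 2 hidden -> 1 output needs 2 more.
weights = np.around(np.random.uniform(size=6), decimals=2)
# one bias per hidden node plus one for the output node
biases = np.around(np.random.uniform(size=3), decimals=2)

print("weights:", weights)
print("biases:", biases)
```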
Week 2: Consider the Week 1 network and compute the following
The weighted sum of the inputs at the first node in the hidden layer.
Assuming a sigmoid activation function, the activation of the first node.
The activation of the second node.
The weighted sum of these activations as inputs to the node in the output layer.
The output of the network, i.e. the activation of the node in the output layer.
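The Week 2 steps can be sketched as a single forward pass. The input values and the weights/biases below are made-up placeholders; any initialization would do:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

x = np.array([0.5, 0.85])            # example inputs (assumed values)
w_hidden = np.array([[0.15, 0.45],   # weights into hidden node 1
                     [0.30, 0.60]])  # weights into hidden node 2
b_hidden = np.array([0.40, 0.65])
w_out = np.array([0.35, 0.55])       # weights into the output node
b_out = 0.20

z_hidden = w_hidden @ x + b_hidden   # weighted sums at the two hidden nodes
a_hidden = sigmoid(z_hidden)         # activations of the first and second node
z_out = w_out @ a_hidden + b_out     # weighted sum at the output node
a_out = sigmoid(z_out)               # output of the network

print("hidden weighted sums:", z_hidden)
print("hidden activations:", a_hidden)
print("output weighted sum:", z_out)
print("network output:", a_out)
```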
Week 3: Initialize a network with the following specification
Takes 5 inputs
Has three hidden layers
Has 3 nodes in the first hidden layer, 2 nodes in the second, and 3 nodes in the third
Has 1 node in the output layer
Print the network and its nodes
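One way to sketch this initializer, assuming parameters are stored per layer in a dictionary (the function name and layout are illustrative, not prescribed):

```python
import numpy as np

def initialize_network(num_inputs, hidden_sizes, num_outputs):
    """Build a dict of random weights and biases, one entry per layer."""
    network = {}
    sizes = [num_inputs] + hidden_sizes + [num_outputs]
    for l in range(1, len(sizes)):
        network[f"layer_{l}"] = {
            # one row of weights per node, one column per input to the layer
            "weights": np.random.uniform(size=(sizes[l], sizes[l - 1])),
            "biases": np.random.uniform(size=sizes[l]),
        }
    return network

# 5 inputs, hidden layers of 3, 2, and 3 nodes, 1 output node
net = initialize_network(5, [3, 2, 3], 1)
for name, layer in net.items():
    print(name, "weights", layer["weights"].shape, "biases", layer["biases"].shape)
```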
Week 4: Consider the Week 3 network and do the following
Change the activation function of the network from sigmoid to tanh and observe the effect on the network's output
Compute the activation of every node in the first hidden layer
Compute the activation of every node in the second hidden layer
Compute the activation of every node in the third hidden layer
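A sketch of the Week 4 forward pass with tanh, assuming the Week 3 shape (5 inputs, hidden layers of 3, 2, and 3 nodes, 1 output) and random placeholder parameters:

```python
import numpy as np

np.random.seed(0)
sizes = [5, 3, 2, 3, 1]  # inputs, three hidden layers, output
weights = [np.random.uniform(size=(sizes[l + 1], sizes[l])) for l in range(len(sizes) - 1)]
biases = [np.random.uniform(size=sizes[l + 1]) for l in range(len(sizes) - 1)]

a = np.random.uniform(size=5)  # an example input vector
for l, (W, b) in enumerate(zip(weights, biases), start=1):
    a = np.tanh(W @ a + b)     # tanh replaces sigmoid as the activation
    print(f"layer {l} activations:", np.around(a, 4))
```

Unlike sigmoid, tanh maps weighted sums into (-1, 1), so activations can be negative.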
Week 5: Consider the Week 4 network and do the following
Change the activation function of the network from tanh to ReLU and observe the effect on the network's output
Compute the activation of every node in the first hidden layer
Compute the activation of every node in the second hidden layer
Compute the activation of every node in the third hidden layer
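The Week 5 change amounts to swapping the activation function in the forward-pass loop. A sketch with random placeholder parameters (weights drawn from [-1, 1] so that ReLU's clipping of negatives is visible):

```python
import numpy as np

def relu(z):
    # ReLU keeps positive values and zeroes out negative ones
    return np.maximum(0.0, z)

np.random.seed(1)
sizes = [5, 3, 2, 3, 1]
a = np.random.uniform(size=5)  # an example input vector
for l in range(len(sizes) - 1):
    W = np.random.uniform(low=-1.0, high=1.0, size=(sizes[l + 1], sizes[l]))
    b = np.random.uniform(low=-1.0, high=1.0, size=sizes[l + 1])
    a = relu(W @ a + b)        # ReLU replaces tanh as the activation
    print(f"layer {l + 1} activations:", np.around(a, 4))
```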
Week 6:
Construct a convolutional neural network and perform classification on the MNIST dataset using 10-fold (K=10) cross-validation.
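The CNN itself would be built with a framework such as Keras; the cross-validation logic, however, can be sketched directly. Below, `n` stands in for the number of training examples, and the per-fold "score" is a placeholder where a real model's accuracy would go:

```python
import numpy as np

def kfold_indices(n, k, seed=0):
    """Yield (train_idx, test_idx) pairs for k-fold cross-validation."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(n)          # shuffle once, then carve into folds
    folds = np.array_split(idx, k)
    for i in range(k):
        test_idx = folds[i]
        train_idx = np.concatenate([folds[j] for j in range(k) if j != i])
        yield train_idx, test_idx

scores = []
for train_idx, test_idx in kfold_indices(n=1000, k=10):
    # a real run would do: model.fit(x[train_idx], y[train_idx]) and then
    # evaluate on x[test_idx]; here we record the fold size as a placeholder
    scores.append(len(test_idx))
print("per-fold sizes:", scores)
```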
Week 7:
Consider the network and dataset from Week 6: visualize the hidden-layer features and compute the confusion matrix.
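The confusion matrix can be computed directly from true and predicted labels. The small 3-class example below is made up for illustration; MNIST would use 10 classes and the model's actual predictions:

```python
import numpy as np

def confusion_matrix(y_true, y_pred, num_classes):
    cm = np.zeros((num_classes, num_classes), dtype=int)
    for t, p in zip(y_true, y_pred):
        cm[t, p] += 1  # rows: true class, columns: predicted class
    return cm

y_true = np.array([0, 0, 1, 1, 2, 2])
y_pred = np.array([0, 1, 1, 1, 2, 0])  # two of the six predictions are wrong
cm = confusion_matrix(y_true, y_pred, num_classes=3)
print(cm)
```

Diagonal entries count correct predictions; off-diagonal entries show which classes get confused with which.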
Week 8:
Construct a CNN model with 7 layers and compute the performance of the model on the Cats and Dogs dataset using 5-fold (K=5) cross-validation.
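Before building such a model in a framework, it helps to check the shape arithmetic by hand. The layer plan below is one plausible reading of "7 layers" (three conv/pool pairs plus a dense classifier), and the 128-pixel input is an assumed resize of the Cats and Dogs images:

```python
def conv_out(size, kernel, stride=1, padding=0):
    # standard convolution output-size formula
    return (size - kernel + 2 * padding) // stride + 1

size = 128  # assumed input resolution after resizing
layers = [
    ("conv 3x3, pad 1", lambda s: conv_out(s, 3, padding=1)),
    ("maxpool 2x2",     lambda s: s // 2),
    ("conv 3x3, pad 1", lambda s: conv_out(s, 3, padding=1)),
    ("maxpool 2x2",     lambda s: s // 2),
    ("conv 3x3, pad 1", lambda s: conv_out(s, 3, padding=1)),
    ("maxpool 2x2",     lambda s: s // 2),
]
for name, f in layers:
    size = f(size)
    print(f"{name}: {size} x {size}")
# the 7th layer would be a dense classifier on the flattened feature map
```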
Week 9:
Construct AlexNet on the MNIST dataset and compute the performance evaluation metrics.
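Whatever model produces the predictions, the evaluation metrics reduce to arithmetic on the confusion matrix. A sketch with a made-up 2-class matrix (MNIST would use a 10x10 matrix):

```python
import numpy as np

cm = np.array([[50, 10],
               [5, 35]])  # rows: true class, columns: predicted class

tp = np.diag(cm).astype(float)
precision = tp / cm.sum(axis=0)  # TP / (TP + FP), per class
recall = tp / cm.sum(axis=1)     # TP / (TP + FN), per class
f1 = 2 * precision * recall / (precision + recall)
accuracy = tp.sum() / cm.sum()

print("precision:", np.around(precision, 3))
print("recall:", np.around(recall, 3))
print("F1:", np.around(f1, 3))
print("accuracy:", accuracy)
```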
Week 10:
Construct a VGG16 network, transfer the pre-trained weights from ImageNet, and use it for classification of the Cats and Dogs dataset.
Week 11:
Construct an RNN for the MNIST dataset and evaluate all performance metrics.
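A real Week 11 model would use a framework RNN layer, but the recurrence it implements is compact enough to sketch by hand. Sizes and weights below are made-up placeholders:

```python
import numpy as np

np.random.seed(3)
input_size, hidden_size, steps = 4, 3, 5
Wx = np.random.uniform(-0.5, 0.5, size=(hidden_size, input_size))  # input weights
Wh = np.random.uniform(-0.5, 0.5, size=(hidden_size, hidden_size)) # recurrent weights
b = np.zeros(hidden_size)

h = np.zeros(hidden_size)                          # initial hidden state
xs = np.random.uniform(size=(steps, input_size))   # a toy input sequence
for t in range(steps):
    # the RNN recurrence: mix the current input with the previous state
    h = np.tanh(Wx @ xs[t] + Wh @ h + b)
print("final hidden state:", np.around(h, 4))
```

For MNIST, each 28x28 image is typically fed as a sequence of 28 rows of 28 pixels.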
Week 12:
Construct an LSTM network for the Cats and Dogs dataset and evaluate all performance metrics.
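The step an LSTM layer performs can likewise be sketched directly: forget, input, and output gates control how the cell state is updated. Weights below are random placeholders; a real Week 12 model would use a framework LSTM layer:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

np.random.seed(7)
input_size, hidden_size = 4, 3
# one weight matrix per gate, acting on [x, h_prev] concatenated
W = {g: np.random.uniform(-0.5, 0.5, size=(hidden_size, input_size + hidden_size))
     for g in ("f", "i", "o", "c")}
b = {g: np.zeros(hidden_size) for g in ("f", "i", "o", "c")}

x = np.random.uniform(size=input_size)
h_prev, c_prev = np.zeros(hidden_size), np.zeros(hidden_size)
z = np.concatenate([x, h_prev])

f = sigmoid(W["f"] @ z + b["f"])        # forget gate: what to keep of c_prev
i = sigmoid(W["i"] @ z + b["i"])        # input gate: what to write
o = sigmoid(W["o"] @ z + b["o"])        # output gate: what to expose
c_tilde = np.tanh(W["c"] @ z + b["c"])  # candidate cell state
c = f * c_prev + i * c_tilde            # new cell state
h = o * np.tanh(c)                      # new hidden state
print("h:", np.around(h, 4), "c:", np.around(c, 4))
```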
Textbook
Ian Goodfellow, Yoshua Bengio, and Aaron Courville, Deep Learning (1st ed.), MIT Press, 2017. ISBN 978-0262035613.
Reference Textbook
Charu C. Aggarwal, Neural Networks and Deep Learning (1st ed.), Springer International Publishing AG, part of Springer Nature, 2018. ISBN 978-3319944623.