From understanding a simple perceptron to building a basic feedforward neural network for digit classification using Python and NumPy.
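The perceptron part can be sketched in a few lines of NumPy. This is an illustrative example, not the lab's actual code: it trains a single perceptron on the logical AND function with the classic error-driven update rule (all names and hyperparameters here are assumptions).

```python
import numpy as np

# Classic perceptron update rule on the (linearly separable) AND problem.
def train_perceptron(X, y, lr=0.1, epochs=20):
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            pred = 1 if xi @ w + b > 0 else 0
            # Weights move only when the prediction is wrong.
            w += lr * (yi - pred) * xi
            b += lr * (yi - pred)
    return w, b

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])  # targets for logical AND
w, b = train_perceptron(X, y)
preds = (X @ w + b > 0).astype(int)  # → [0, 0, 0, 1]
```

The same update loop, stacked into layers with nonlinear activations and trained by backpropagation, is the step from the perceptron to the feedforward network used for digit classification.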
Develop a basic Deep Neural Network for a classification problem using PyTorch 🔥. Mitigate overfitting on the training set and perform fair model selection.
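A minimal sketch of the two regularization levers typically used here, dropout and weight decay, assuming a small MLP on stand-in data (the layer sizes, learning rate, and data are illustrative, not taken from the lab):

```python
import torch
from torch import nn

torch.manual_seed(0)

# Illustrative MLP for a 2-class problem (sizes are assumptions).
model = nn.Sequential(
    nn.Linear(20, 64),
    nn.ReLU(),
    nn.Dropout(p=0.5),  # dropout to curb overfitting
    nn.Linear(64, 2),
)
# weight_decay adds L2 regularization on the parameters.
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3, weight_decay=1e-4)
loss_fn = nn.CrossEntropyLoss()

# One training step on random stand-in data.
X = torch.randn(32, 20)
y = torch.randint(0, 2, (32,))
optimizer.zero_grad()
loss = loss_fn(model(X), y)
loss.backward()
optimizer.step()
```

Fair model selection means choosing hyperparameters (dropout rate, weight decay, layer sizes) on a validation split, never on the test set.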
Develop a Convolutional Neural Network for image classification on the CIFAR-10 dataset.
Develop a Recurrent Neural Network (RNN) for sentiment analysis using the IMDB dataset. Compare a simple RNN with more advanced recurrent models such as LSTM and GRU in terms of computational load, the number of parameters, and performance. Additionally, experiment with a bidirectional model to uncover the strengths and weaknesses of this technique. Finally, solve the same classification problem with a Transformer to gain a deeper understanding of its internal workings.
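The parameter-count comparison can be made concrete directly from PyTorch's layer definitions. Assuming an embedding dimension of 100 and a hidden size of 128 (illustrative values), the gating structure shows up exactly in the counts: an LSTM has 4 gate blocks, a GRU 3, and a bidirectional model doubles everything:

```python
from torch import nn

def n_params(m):
    # Total number of trainable parameters in a module.
    return sum(p.numel() for p in m.parameters())

# Same input/hidden sizes across all recurrent variants (values are illustrative).
rnn = nn.RNN(100, 128, batch_first=True)
lstm = nn.LSTM(100, 128, batch_first=True)
gru = nn.GRU(100, 128, batch_first=True)
bi_lstm = nn.LSTM(100, 128, batch_first=True, bidirectional=True)

# LSTM = 4x the plain RNN, GRU = 3x, bidirectional = 2x its unidirectional base.
```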
Develop a simple shallow autoencoder for dimensionality reduction. Next, create a deep version. Finally, experiment with applying autoencoders to data-denoising tasks (denoising autoencoders).
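A shallow autoencoder is just one linear encoder/decoder pair; the denoising variant corrupts the input but reconstructs the clean target. A minimal sketch, assuming flattened 784-dimensional inputs and a 32-dimensional bottleneck (both sizes are illustrative):

```python
import torch
from torch import nn

torch.manual_seed(0)

# Shallow autoencoder: 784 -> 32 -> 784 (sizes are assumptions).
encoder = nn.Linear(784, 32)
decoder = nn.Linear(32, 784)

x = torch.rand(16, 784)  # stand-in batch of flattened images
# Denoising variant: corrupt the input, reconstruct the *clean* target.
noisy = x + 0.2 * torch.randn_like(x)
recon = decoder(torch.relu(encoder(noisy)))
loss = nn.functional.mse_loss(recon, x)
```

The deep version simply stacks more encoder/decoder layers around the same bottleneck.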
Define a Variational Autoencoder (VAE) using fundamental PyTorch components. Next, establish a training loop that includes the two essential losses for VAE training: the reconstruction loss and the KL-divergence loss. Use this training loop to train the model on the MNIST dataset, choosing appropriate hyperparameters. Lastly, explore and analyze the latent encodings learned by the model through various visualization techniques.
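A minimal sketch of the two pieces the lab asks for, the reparameterized forward pass and the two-term loss, assuming 784-dimensional flattened MNIST inputs and a 2-D latent space (all layer sizes and names here are illustrative):

```python
import torch
from torch import nn

torch.manual_seed(0)

class VAE(nn.Module):
    def __init__(self, in_dim=784, latent_dim=2):
        super().__init__()
        self.enc = nn.Linear(in_dim, 64)
        self.mu = nn.Linear(64, latent_dim)
        self.logvar = nn.Linear(64, latent_dim)
        self.dec = nn.Sequential(
            nn.Linear(latent_dim, 64), nn.ReLU(),
            nn.Linear(64, in_dim), nn.Sigmoid(),
        )

    def forward(self, x):
        h = torch.relu(self.enc(x))
        mu, logvar = self.mu(h), self.logvar(h)
        # Reparameterization trick: sample z while keeping gradients.
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)
        return self.dec(z), mu, logvar

def vae_loss(recon, x, mu, logvar):
    # Reconstruction term + KL divergence to a standard normal prior.
    rec = nn.functional.binary_cross_entropy(recon, x, reduction="sum")
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return rec + kl

x = torch.rand(8, 784)  # stand-in for a batch of flattened MNIST images
recon, mu, logvar = VAE()(x)
loss = vae_loss(recon, x, mu, logvar)
```

With a 2-D latent space, the learned encodings can be visualized directly as a scatter plot colored by digit label, which is a common way to inspect what the latent space has organized.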