100 Days of ML

Daily log to track my progress on the 100 days of ML code challenge.

Description

100 Day ML Challenge to learn and implement ML/DL concepts ranging from the basics to more advanced state of the art models.

Daily Logs

Day 1 [09/09/20]: Multivariate Linear Regression

Day 2 [10/09/20]: Applying Regression

  • Used the Seoul Bike Sharing Demand dataset from the UCI Machine Learning Repository for multivariate regression.
  • Utilized the Keras library through TensorFlow.
  • Used a Sequential model with two hidden layers (sketch below).
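
A minimal sketch of that setup, assuming the CSV has been downloaded locally as `SeoulBikeData.csv`; the file name, encoding, column names, and layer widths are assumptions, not the exact notebook code:

```python
import pandas as pd
from tensorflow import keras

# Assumed local copy of the UCI Seoul Bike Sharing Demand CSV.
df = pd.read_csv("SeoulBikeData.csv", encoding="latin-1")
X = df.select_dtypes("number").drop(columns=["Rented Bike Count"]).to_numpy(dtype="float32")
y = df["Rented Bike Count"].to_numpy(dtype="float32")

# Standardize the inputs, then stack two hidden layers.
norm = keras.layers.Normalization()
norm.adapt(X)

model = keras.Sequential([
    norm,
    keras.layers.Dense(64, activation="relu"),
    keras.layers.Dense(64, activation="relu"),
    keras.layers.Dense(1),  # single regression output
])
model.compile(optimizer="adam", loss="mse", metrics=["mae"])
model.fit(X, y, epochs=20, validation_split=0.2)
```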

Day 3 [13/09/20]: Custom Regression Model

  • Built a custom hand-tuned regression model based on previous results.
  • Trained using basic matrix operations and the Adam optimizer.
  • Watched Stanford's CS229 lecture on Linear Regression and Gradient Descent taught by Andrew Ng.

Day 4 [14/09/2020]: Generative vs. Discriminative

  • Watched Stanford's CS229 lecture on GDA & Naive Bayes.
  • Noted the difference between generative and discriminative models.

Day 5 [15/09/20]: Naive Bayes

Day 6 [16/09/20]: Naive Bayes Project

  • Finished the Iris Flower Classifier using Naive Bayes.
  • Reached an accuracy of about 96% (sketch below).
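
A minimal version of that classifier using scikit-learn's Gaussian Naive Bayes and its bundled Iris data; the split ratio and seed are assumptions:

```python
from sklearn.datasets import load_iris
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=42)

# Gaussian NB models each feature as normally distributed within a class.
clf = GaussianNB().fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, clf.predict(X_test)))  # typically ~0.95-0.98
```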

Day 7 [17/09/20]: Support Vector Machines

Day 8 [18/09/20]: SVM Project

  • Started a project on classifying breast cancer tumors using SVM.
  • Followed a tutorial on YouTube by Sentdex on SVM.
  • Reached an accuracy of around 97% (sketch below).
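
A condensed version of that kind of classifier, using scikit-learn's bundled Wisconsin breast cancer data as a stand-in for the tutorial's dataset; the kernel choice and split are assumptions:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

# A linear-kernel SVM finds the maximum-margin hyperplane between classes.
clf = SVC(kernel="linear").fit(X_train, y_train)
print("accuracy:", clf.score(X_test, y_test))  # typically around 0.95-0.97
```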

Day 9 [19/09/20]: Classification

  • Going back to the basics and approaching classification from a mathematical standpoint.
  • Completed the Classification and Representation section of Stanford's Machine Learning course on Coursera.

Day 10 [20/09/20]: Kernels

  • Watched Stanford's CS229 lecture on Kernels.
  • Learned the representer theorem.

Day 11 [21/09/20]: Kernels Continued

  • Finished the Stanford CS229 lecture on Kernels.
  • Learned about the complexity difference when using inner products.

Day 12 [23/09/20]: Bias and Variance

Day 13 [24/09/2020]: Cross-Validation

  • Finished watching the CS229 lecture on Cross-Validation.
  • Learned about:
    • How and when to use k-fold cross-validation (see the sketch below).
    • How and when to use leave-one-out cross-validation.
    • Feature selection.
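
A small sketch contrasting the two strategies with scikit-learn; the model and dataset here are placeholders, not anything from the lecture:

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import KFold, LeaveOneOut, cross_val_score

X, y = load_iris(return_X_y=True)
model = LogisticRegression(max_iter=1000)

# k-fold: k train/validate rounds, each holding out one fold.
kfold = cross_val_score(model, X, y, cv=KFold(n_splits=5, shuffle=True, random_state=0))
# Leave-one-out: one round per sample; thorough but expensive on large datasets.
loo = cross_val_score(model, X, y, cv=LeaveOneOut())

print("5-fold mean:", kfold.mean(), "LOO mean:", loo.mean())
```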

Day 14 [25/09/2020]: Approx/Estimation Error

  • Watched Stanford's CS229 lecture on Approx/Estimation Error.
  • Learned about:
    • Sampling Distributions
    • Parameter View
    • Bayes Error
    • Approximation Error
    • Estimation Error
Day 15 [26/09/2020]: Empirical Risk Minimization

  • Finished up the CS229 lecture on ERM.
  • Uniform convergence.

Day 16 [27/09/2020]: Decision Trees

  • Started watching Stanford's CS229 lecture on Decision Trees and Ensemble Methods.
  • Misclassification loss and its inability to capture differences between certain splits.
  • How cross-entropy loss addresses the shortcomings of misclassification loss.

Day 17 [28/09/2020]: Decision Trees Cont.

  • Continued Stanford's CS229 lecture on Decision Trees and Ensemble Methods.
  • Regression Trees.
  • Regularization of Decision Trees.
  • Runtime for Decision Trees.
  • Advantages and disadvantages of decision trees.

Day 18 [29/09/2020]: Ensemble Methods

  • Finished up Stanford's CS229 lecture on Decision Trees and Ensemble Methods.
  • How to combine different learning algorithms and average their results.
  • How to utilize different training sets.

Day 19 [30/09/2020]: Decision Trees Mini Project

  • Implemented decision trees on the Iris dataset from the UC Irvine Machine Learning Repository.
  • Reached an accuracy of ~97% (sketch below).
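
A minimal version of that experiment with scikit-learn; the depth limit and split are assumptions:

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)

# Limiting depth regularizes the tree and keeps it interpretable.
tree = DecisionTreeClassifier(max_depth=3).fit(X_train, y_train)
print("accuracy:", tree.score(X_test, y_test))  # ~0.97 on a typical split
```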

Day 20 [01/10/2020]: Neural Networks

  • Started Stanford's CS229 lecture on Introduction to Neural Networks.
  • Learned about:
    • Equational form of neurons and models.
    • Neural networks as a form of linear regression.
    • Softmax.

Day 21 [02/10/2020]: Neural Networks Cont.

Day 22 [03/10/2020]: Dense Neural Network Mini Project

  • Trained a neural network model to classify images of clothing.
  • Utilized the Fashion MNIST dataset.
  • Followed the TensorFlow guide (condensed sketch below).
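
A condensed version of the model from that guide; the epoch count is an assumption:

```python
import tensorflow as tf

(x_train, y_train), (x_test, y_test) = tf.keras.datasets.fashion_mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0  # scale pixels to [0, 1]

model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),  # 28x28 image -> 784 vector
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10),  # one logit per clothing class
])
model.compile(optimizer="adam",
              loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=10)
model.evaluate(x_test, y_test)
```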

Day 23 [04/10/2020]: Backprop

Day 24 [05/10/2020]: Debugging ML Models

Day 25 [06/10/2020]: Neural Networks: Representation

  • Week 4 of the Machine Learning course on Coursera.
  • Non-linear Hypotheses.
  • Neurons and the Brain.
  • Model representation.

Day 26 [07/10/20]: Neural Networks Mini Project 2

  • Continued Week 4 of the Machine Learning course on Coursera.
  • Sentiment analysis neural network classifier.
  • Utilized the IMDB dataset.

Day 27 [08/10/20]: Expectation-Maximization Algorithms

Day 28 [09/10/20]: K-Means Clustering

Day 29 [11/10/20]: K-Means Mini Project

  • Generated a random dataset for clustering.
  • Used scikit-learn's KMeans (sketch below).
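
A small sketch of that workflow; `make_blobs` stands in for whatever random data was generated:

```python
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs

# Synthetic, clearly clustered points.
X, _ = make_blobs(n_samples=300, centers=4, random_state=0)

# KMeans alternates assigning points to centroids and recomputing centroids.
km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X)
print(km.cluster_centers_)
print(km.labels_[:10])
```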

Day 30 [12/10/20]: Convolutional Neural Networks

  • Some of the things I learned today:
    • What are convolutional neural networks?
    • What is the function of the CNN kernel?

Day 31 [13/10/20]: ConvNet Cont.

  • Continued to read up on ConvNets.
  • Learned about the max pooling layer.

Day 31 [15/10/20]: CNN Mini-Project

  • Utilized the CIFAR10 dataset.
  • Followed TensorFlow's Convolutional Neural Network tutorial (condensed sketch below).
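
A condensed version of the convolutional stack from that tutorial; the epoch count is an assumption:

```python
import tensorflow as tf
from tensorflow.keras import layers, models

(x_train, y_train), (x_test, y_test) = tf.keras.datasets.cifar10.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

model = models.Sequential([
    layers.Conv2D(32, (3, 3), activation="relu", input_shape=(32, 32, 3)),
    layers.MaxPooling2D((2, 2)),  # downsample feature maps
    layers.Conv2D(64, (3, 3), activation="relu"),
    layers.MaxPooling2D((2, 2)),
    layers.Conv2D(64, (3, 3), activation="relu"),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(10),  # one logit per CIFAR10 class
])
model.compile(optimizer="adam",
              loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=10, validation_data=(x_test, y_test))
```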

Day 32 [16/10/20]: Recurrent Neural Networks

  • Some of the things I learned today:
    • What are recurrent neural networks?
    • What makes RNNs more powerful than other architectures?

Day 33 [17/10/20]: RNNs Cont.

  • Learned about the different RNN architectures.
  • Explored the different applications of RNNs.

Day 34 [19/10/20]: RNN Mini Project

  • Implemented an RNN using Keras (sketch below).
  • Trained it on the IMDB reviews dataset.
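
A minimal sketch of that sort of model; the vocabulary size, sequence length, and layer sizes are assumptions:

```python
import tensorflow as tf

vocab_size, maxlen = 10000, 200
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.imdb.load_data(num_words=vocab_size)
x_train = tf.keras.preprocessing.sequence.pad_sequences(x_train, maxlen=maxlen)
x_test = tf.keras.preprocessing.sequence.pad_sequences(x_test, maxlen=maxlen)

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(vocab_size, 32),       # word index -> dense vector
    tf.keras.layers.SimpleRNN(32),                   # plain recurrent layer
    tf.keras.layers.Dense(1, activation="sigmoid"),  # positive/negative review
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(x_train, y_train, epochs=3, batch_size=128,
          validation_data=(x_test, y_test))
```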

Day 34 [20/10/20]: Deep Learning PC

  • Built a deep learning computer to train networks.
  • Here are the basic specs:
    • CPU: Ryzen 7 3800XT
    • GPU: Nvidia 3080 FE
    • RAM: 16GB 3600MHz

Day 35 [21/10/20]: RNN Mini Project Cont.

  • Trained the model.
  • Reached a final accuracy of 0.855.

Day 36 [22/10/20]: LSTM

  • Learned about:
    • Why LSTMs were made.
    • How LSTMs solved issues with RNNs.

Day 37 [23/10/20]: LSTM Cont.

  • Learned more about the applications of LSTMs.
  • Dove deep into the LSTM architecture.

Day 38 [25/10/20]: LSTM Mini Project

Day 39 [26/10/20]: Gated Recurrent Unit

  • Learned:
    • What are GRUs?
    • Applications of GRUs.
    • GRUs vs LSTMs.

Day 40 [27/10/20]: GRU Cont.

  • Learned how to implement a GRU model using TensorFlow and Keras.
  • Started on a new mini-project to put GRUs to use (sketch below).
  • Utilized the IBM stock dataset to predict stock prices.
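
A rough sketch of the windowed-forecasting setup; a synthetic random-walk series stands in for the stock prices so the example runs end to end, and the window and layer sizes are assumptions:

```python
import numpy as np
import tensorflow as tf

# Synthetic stand-in for a closing-price series.
prices = np.cumsum(np.random.randn(1000)).astype("float32")

# Slice the series into (window of past prices, next price) pairs.
window = 30
X = np.stack([prices[i:i + window] for i in range(len(prices) - window)])
y = prices[window:]
X = X[..., None]  # shape (samples, timesteps, 1)

model = tf.keras.Sequential([
    tf.keras.layers.GRU(32, input_shape=(window, 1)),
    tf.keras.layers.Dense(1),  # predict the next value
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=32)
```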

Day 41 [28/10/20]: Hopfield Network

  • Learned about:
    • What Hopfield networks are.
    • How to use Hopfield networks.
    • How Hopfield networks improve on the RNN model.

Day 42 [29/10/20]: Boltzmann Machine

  • Learned about:
    • What Boltzmann Machines are.
    • Use cases for Boltzmann Machines.
    • The architecture of a Boltzmann Machine.

Day 43 [31/10/20]: Deep Belief Networks

  • Learned about:
    • What Deep Belief Networks are.
    • The general architecture of a DBN.

Day 44 [02/11/20]: Autoencoders

  • Learned about:
    • What Autoencoder networks are.
    • How an Autoencoder functions.
    • The components that make up an Autoencoder.
    • Applications of Autoencoders.

Day 45 [03/11/20]: Autoencoders Mini-Project

  • Utilized TensorFlow to implement autoencoders.
  • Performed image denoising on the Fashion MNIST dataset (sketch below).
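
A compact denoising autoencoder along the lines of TensorFlow's autoencoder tutorial; the noise level, filter counts, and epochs are assumptions:

```python
import numpy as np
import tensorflow as tf

(x_train, _), (x_test, _) = tf.keras.datasets.fashion_mnist.load_data()
x_train = (x_train[..., None] / 255.0).astype("float32")
x_test = (x_test[..., None] / 255.0).astype("float32")

# Add Gaussian noise; the model learns to map noisy -> clean images.
noise = 0.2
x_train_noisy = np.clip(x_train + noise * np.random.randn(*x_train.shape).astype("float32"), 0.0, 1.0)
x_test_noisy = np.clip(x_test + noise * np.random.randn(*x_test.shape).astype("float32"), 0.0, 1.0)

autoencoder = tf.keras.Sequential([
    # Encoder: compress 28x28x1 down to 7x7x8.
    tf.keras.layers.Conv2D(16, 3, strides=2, padding="same", activation="relu",
                           input_shape=(28, 28, 1)),
    tf.keras.layers.Conv2D(8, 3, strides=2, padding="same", activation="relu"),
    # Decoder: upsample back to 28x28x1.
    tf.keras.layers.Conv2DTranspose(8, 3, strides=2, padding="same", activation="relu"),
    tf.keras.layers.Conv2DTranspose(16, 3, strides=2, padding="same", activation="relu"),
    tf.keras.layers.Conv2D(1, 3, padding="same", activation="sigmoid"),
])
autoencoder.compile(optimizer="adam", loss="mse")
autoencoder.fit(x_train_noisy, x_train, epochs=5,
                validation_data=(x_test_noisy, x_test))
```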

Day 45 [04/11/20]: Autoencoders Mini-Project Cont.

  • Utilized TensorFlow to implement autoencoders.
  • Performed anomaly detection on the ECG5000 dataset.

Day 46 [05/11/20]: Generative Adversarial Network

  • Learned about:
    • What generative adversarial networks are.
    • What GANs are used for.
    • The architecture of a GAN.

Day 47 [06/11/20]: Generative Adversarial Network Implementation

  • Used TensorFlow to implement GANs.
  • Utilized the MNIST dataset for generating handwritten digits.

Day 48 [07/11/20]: Generative Adversarial Network Implementation Cont.

  • Continuation of the implementation I started yesterday.
  • Worked on the loss functions & optimizers (sketch below).
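
The loss/optimizer setup here follows the pattern from TensorFlow's DCGAN tutorial; whether the notebook matched it exactly is an assumption:

```python
import tensorflow as tf

cross_entropy = tf.keras.losses.BinaryCrossentropy(from_logits=True)

def discriminator_loss(real_output, fake_output):
    # The discriminator should score real images as 1 and generated ones as 0.
    real_loss = cross_entropy(tf.ones_like(real_output), real_output)
    fake_loss = cross_entropy(tf.zeros_like(fake_output), fake_output)
    return real_loss + fake_loss

def generator_loss(fake_output):
    # The generator wins when the discriminator labels its fakes as real.
    return cross_entropy(tf.ones_like(fake_output), fake_output)

# Two separate optimizers, since the networks are trained against each other.
generator_optimizer = tf.keras.optimizers.Adam(1e-4)
discriminator_optimizer = tf.keras.optimizers.Adam(1e-4)
```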

Day 49 [08/11/20]: GAN Implementation Cont.

  • Training the model took a lot longer than I was expecting.
  • Trained the model for 50 epochs; each epoch took around 1.5 minutes.

Day 50 [09/11/20]: fast.ai Course

Day 51 [10/11/20]: Model Development

  • Started Lesson 2 of the fast.ai course.
  • Learned about:
    • Project planning for model development.
    • How to create datasets.
    • Productionization of models.

Day 52 [11/11/20]: RecycleNet Project

  • Worked on my research project, RecycleNet.
  • Cleaned and preprocessed the images for the dataset.
  • Check out the entire project at RecycleNet.

Day 53 [12/11/20]: TensorFlow GPU

  • Set up TensorFlow GPU to utilize my RTX 3080.
  • Installed Docker and created a TensorFlow image.
  • Started a container and ran TensorFlow code in Jupyter using TensorFlow GPU (check below).
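
A quick sanity check that TensorFlow inside the container actually sees the GPU:

```python
import tensorflow as tf

# Lists the GPUs TensorFlow can see; expect one entry for the RTX 3080.
print(tf.config.list_physical_devices("GPU"))
```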

Day 54 [13/11/20]: Production and Development

  • Started Lesson 3 of the fast.ai course.
  • Learned about:
    • Data augmentation using the fastai API.
    • How to create notebook apps.
    • Deploying using Binder.
    • Feedback loops and how they can affect models over time.

Day 55 [14/11/20]: TensorFlow Serving

Day 56 [15/11/20]: ResNet-50

  • Worked on my RecycleNet research project.
  • Configured ResNet-50 to train on our custom dataset.

Day 57 [16/11/20]: Stochastic Gradient Descent

  • Watched Lesson 4 of the fastai course on Stochastic Gradient Descent.

Day 58 [17/11/20]: Stochastic Gradient Descent Cont.

  • Continued watching Lesson 4 of the fastai course on Stochastic Gradient Descent.

Day 59 [19/11/20]: Chatbot

  • Started reading about chatbots using neural networks.
  • There are two types of deep learning chatbot models:
    • Retrieval-based Neural Network
    • Generation-based Neural Network

Day 60 [20/11/20]: Chatbot Research

  • Read more articles on generation-based neural networks.
  • Revisited sequence-to-sequence models that use an encoder/decoder architecture.
  • Read the article Generative Model Chatbots, which used a seq2seq model to train a chatbot on several different datasets.

Day 61 [21/11/20]: DS Exam

  • Taking time off to study for my data structures midterm!
  • Learned about graphs and their similarities to a representation of a neuron.

Day 62 [23/11/20]: Research Paper

  • Started working on the research paper.
  • The paper consists of a custom ResNet-50 and SVM model.

Day 63 [25/11/20]: Seq2Seq

  • Started reading about seq2seq models.
  • Planning on creating a chatbot mini-project soon.

Day 64 [27/11/20]: Seq2Seq Cont.

Day 65 [29/11/20]: ResNet-50 + SVM

  • Worked on fine-tuning a custom ResNet model with my research partner.

Day 66 [30/11/20]: Seq2Seq

  • Started coding the Seq2Seq model.

Day 67 [1/12/20]: Predicting using ResNet-50 + SVM

  • Fine-tuned parameters by implementing a grid search for the SVM (sketch below).
  • Used the custom architecture for predicting items from the dataset.
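
A sketch of the SVM grid search with scikit-learn; `load_digits` stands in for the ResNet-50 feature vectors, and the parameter grid is an assumption:

```python
from sklearn.datasets import load_digits
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

# Stand-in features; in the project these would be ResNet-50 embeddings.
X, y = load_digits(return_X_y=True)

param_grid = {
    "C": [0.1, 1, 10, 100],
    "gamma": ["scale", 0.01, 0.001],
    "kernel": ["rbf", "linear"],
}
# Exhaustively tries every combination with 5-fold cross-validation.
search = GridSearchCV(SVC(), param_grid, cv=5, n_jobs=-1)
search.fit(X, y)
print(search.best_params_, search.best_score_)
```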

Day 68 [2/12/20]: Research Presentation

  • My partner and I presented our research project to the panel members.
  • We finished our paper, titled Classification of Recyclable Waste Generated in Indian Households.
  • Looking forward to publishing our paper at an IEEE conference.

Day 69 [3/12/20]: NLP

  • Read articles about the fundamentals of natural language processing.
  • Learned about the different ways to understand text.

Day 70 [4/12/20]: Stemming

  • Started to dive deeper into NLP.
  • Learned about stemming and the applications of stemming.

Day 71 [5/12/20]: Lemmatization

  • Learned the process of lemmatization.
  • Explored the difference between lemmatization and stemming.

Day 72 [6/12/20]: Recommender Systems

  • Learned about recommender systems.
  • Read about Neural Collaborative Filtering and its application in recommender systems.

Day 73 [7/12/20]: Optimizers

  • Started reading about various optimization algorithms for training neural networks.

Day 74 [8/12/20]: Adam Optimizer

  • Dove deep into the Adam optimizer (update-rule sketch below).
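
For reference, the Adam update rule from the original paper as a small NumPy sketch; the default hyperparameters are shown:

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update; m and v start at zero, t counts steps from 1."""
    m = beta1 * m + (1 - beta1) * grad        # first moment: running mean of gradients
    v = beta2 * v + (1 - beta2) * grad ** 2   # second moment: running mean of squared gradients
    m_hat = m / (1 - beta1 ** t)              # bias correction for the zero initialization
    v_hat = v / (1 - beta2 ** t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v
```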

Day 75 [9/12/20]: Momentum Optimizer

  • Read about momentum, which accelerates gradient descent by accumulating past gradients.

Day 76 [10/12/20]: Nesterov Accelerated Gradient

  • Continued reading about optimizers by exploring NAG.

Day 77 [11/12/20]: Adadelta

  • Explored Adadelta, which addresses Adagrad's monotonically decreasing learning rate by accumulating gradients over a restricted window.

Day 78 [12/12/20]: Adagrad

  • Learned about the Adagrad optimizer, which adapts the learning rate to individual features.

Day 79 [13/12/20]: Cognitive Science

  • Started exploring the human brain from a neuroscience perspective.
  • Read more about Donald Hoffman's case against reality.

Day 80 [14/12/20]: Parts of the Brain

  • Started looking at different parts of the brain and how they function.
  • Trying to draw the relationship between artificial neurons and the human brain.

Day 81 [15/12/20]: GCP

  • Started a tutorial on GCP.
  • Learning how to use their cloud services for machine learning.
