Machine Learning Notebooks
Helpful Jupyter notebooks that I compiled while learning Machine Learning and Deep Learning from various sources on the Internet.
- NumPy Basics
- Feature Selection: Imputing missing values, Encoding, Binarizing.
- Feature Scaling: Min-Max Scaling, Normalizing, Standardizing.
- Feature Extraction: CountVectorizer, DictVectorizer, TfidfVectorizer.
- Linear & Multiple Regression
- Backward Elimination: Method of Backward Elimination, P-values.
- Polynomial Regression
- Support Vector Regression
- Decision Tree Regression
- Random Forest Regression
- Robust Regression using Theil-Sen Regression
- Pipelines in Scikit-Learn
- Logistic Regression
- Regularization
- K Nearest Neighbors
- Support Vector Machines
- Naive Bayes
- Decision Trees
- KMeans
- Minibatch KMeans
- Hierarchical Clustering
- Application of Clustering: Image Quantization
- Application of Clustering: Outlier Detection
- Cross Validation and its types
- Confusion Matrix, Precision, Recall
- R Squared
- ROC Curve, AUC
- Silhouette Distance
- Apriori Algorithm
- Eclat Model
- Upper Confidence Bound Algorithm
- Thompson Sampling
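The list above is only an index of the notebooks. As a quick taste of what the Feature Scaling notebook covers, here is a minimal NumPy sketch of min-max scaling and standardization (the notebooks themselves use scikit-learn's `MinMaxScaler` and `StandardScaler`; the function names below are my own illustration, not from the notebooks):

```python
import numpy as np

def min_max_scale(X):
    """Rescale each column to the [0, 1] range, as MinMaxScaler does."""
    x_min = X.min(axis=0)
    x_max = X.max(axis=0)
    return (X - x_min) / (x_max - x_min)

def standardize(X):
    """Center each column to zero mean and unit variance, as StandardScaler does."""
    return (X - X.mean(axis=0)) / X.std(axis=0)

# Toy feature matrix: two columns on very different scales.
X = np.array([[1.0, 200.0],
              [2.0, 300.0],
              [3.0, 400.0]])

X_mm = min_max_scale(X)   # every column now spans [0, 1]
X_std = standardize(X)    # every column now has mean 0 and std 1
```

Without a step like this, features with large numeric ranges (the second column here) would dominate distance-based models such as K Nearest Neighbors and KMeans.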
- Natural Language Processing: Sentiment Analysis
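To illustrate the sentiment-analysis idea, here is a pure-Python sketch of the bag-of-words + Naive Bayes approach (the notebooks use scikit-learn's `CountVectorizer` and `MultinomialNB`; the toy corpus and `predict` helper below are made up for this example):

```python
from collections import Counter
import math

# Tiny made-up labelled corpus: 1 = positive, 0 = negative.
train = [("i loved this movie", 1),
         ("what a great film", 1),
         ("absolutely wonderful acting", 1),
         ("i hated this movie", 0),
         ("what a terrible film", 0),
         ("absolutely awful acting", 0)]

# Per-class bag-of-words counts, the same statistics CountVectorizer
# + MultinomialNB would compute.
class_counts = Counter(label for _, label in train)
word_counts = {0: Counter(), 1: Counter()}
for text, label in train:
    word_counts[label].update(text.split())
vocab = {w for counts in word_counts.values() for w in counts}

def predict(text):
    """Return the class with the higher log-posterior, with Laplace smoothing."""
    scores = {}
    for c in (0, 1):
        total = sum(word_counts[c].values())
        score = math.log(class_counts[c] / len(train))   # log prior
        for w in text.split():
            score += math.log((word_counts[c][w] + 1) / (total + len(vocab)))
        scores[c] = score
    return max(scores, key=scores.get)

print(predict("great movie"))   # → 1 (positive)
print(predict("awful movie"))   # → 0 (negative)
```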
- What are Activation Functions
- Vanilla Neural Network
- Backpropagation Derivation
- Backpropagation in Python
- Convolutional Neural Networks
- Long Short Term Memory Neural Networks (LSTM)
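In the spirit of the "Backpropagation in Python" notebook, here is a minimal NumPy sketch of a vanilla neural network trained by backpropagation on XOR (a standalone illustration of the technique, not the notebooks' exact code):

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR: the classic problem a single-layer network cannot solve.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# One hidden layer of 8 sigmoid units, one sigmoid output unit.
W1 = rng.normal(0, 1, (2, 8))
b1 = np.zeros(8)
W2 = rng.normal(0, 1, (8, 1))
b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

losses = []
for _ in range(5000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    losses.append(float(np.mean((out - y) ** 2)))

    # Backward pass: chain rule through the squared-error loss.
    d_out = (out - y) * out * (1 - out)   # gradient at output pre-activation
    d_h = (d_out @ W2.T) * h * (1 - h)    # gradient at hidden pre-activation

    # Gradient-descent step.
    lr = 1.0
    W2 -= lr * (h.T @ d_out)
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * (X.T @ d_h)
    b1 -= lr * d_h.sum(axis=0)
```

After training, the loss has dropped well below its starting value, showing that the gradients propagated through both layers are pushing the weights in the right direction.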
Sources:
- Machine Learning by Andrew Ng (Coursera)
- Machine Learning A-Z (Udemy)
- Deep Learning A-Z (Udemy)
- Neural Networks by Geoffrey Hinton (Coursera)
- Scikit-learn Cookbook (Second Edition) by Julian Avila et al.