Big-Data-Machine-Learning-Experiments

Task 1: Logistic Regression on PCA components

In the following code, you have to add a few lines to apply logistic regression to the PCA components. More specifically:

  0. Load the Iris dataset.
  1. Implement StandardScaler-based feature scaling.
  2. Implement PCA with all components and compute the explained variance of each PCA component.
  3. Implement PCA with the two components that explain the most variance. Train and test a logistic regression model on them, give a visual display of the performance by plotting the decision regions along with the test data, and print the test accuracy (see the sketch after this list).
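
A minimal sketch of one possible solution; the 70/30 stratified split, the `random_state`, and the plotting grid resolution are assumptions, not part of the assignment:

```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression

# 0. Load the Iris dataset and split it (assumed 70/30 stratified split)
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=1)

# 1. Standardize the features
scaler = StandardScaler().fit(X_train)
X_train_std, X_test_std = scaler.transform(X_train), scaler.transform(X_test)

# 2. PCA with all components: inspect the explained-variance ratios
pca_full = PCA().fit(X_train_std)
print("Explained variance ratios:", pca_full.explained_variance_ratio_)

# 3. PCA with the two leading components + logistic regression
pca = PCA(n_components=2)
X_train_pca = pca.fit_transform(X_train_std)
X_test_pca = pca.transform(X_test_std)

lr = LogisticRegression(max_iter=1000).fit(X_train_pca, y_train)
print("Test accuracy:", lr.score(X_test_pca, y_test))

# Decision regions over a grid of the two PCA components, with test points overlaid
x_min, x_max = X_train_pca[:, 0].min() - 1, X_train_pca[:, 0].max() + 1
y_min, y_max = X_train_pca[:, 1].min() - 1, X_train_pca[:, 1].max() + 1
xx, yy = np.meshgrid(np.arange(x_min, x_max, 0.02), np.arange(y_min, y_max, 0.02))
Z = lr.predict(np.c_[xx.ravel(), yy.ravel()]).reshape(xx.shape)
plt.contourf(xx, yy, Z, alpha=0.3)
plt.scatter(X_test_pca[:, 0], X_test_pca[:, 1], c=y_test, edgecolor="k")
plt.xlabel("PC 1")
plt.ylabel("PC 2")
plt.show()
```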

Task 2: Adaline vs. Perceptron comparison

In this exercise, take two features from the Iris dataset and train AdalineGD and AdalineSGD, based on the implementations provided below, together with the perceptron implementations from scikit-learn and the one provided below.
You should produce three plots (Adaline rule vs. AdalineSGD vs. scikit-learn Perceptron) along with a performance comparison of these four methods. What do you observe in terms of performance differences? Please explain your interpretation.

You are free to use any combination of two of the four features in this dataset; a sketch using one such pair is given below.
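
The AdalineGD, AdalineSGD, and perceptron implementations the task refers to are the ones provided in the notebook and are not reproduced here. The compact AdalineGD class below is only an illustrative stand-in, and the feature pair (sepal length, petal length), the two-class setosa/versicolor subset, and the learning-rate settings are all assumptions:

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.linear_model import Perceptron
from sklearn.preprocessing import StandardScaler

class AdalineGD:
    """Illustrative Adaline (adaptive linear neuron) trained with full-batch gradient descent."""
    def __init__(self, eta=0.5, n_iter=50):
        self.eta, self.n_iter = eta, n_iter

    def fit(self, X, y):
        self.w_ = np.zeros(X.shape[1])
        self.b_ = 0.0
        for _ in range(self.n_iter):
            output = X @ self.w_ + self.b_              # linear activation
            errors = y - output
            self.w_ += self.eta * X.T @ errors / X.shape[0]
            self.b_ += self.eta * errors.mean()
        return self

    def predict(self, X):
        # Threshold the linear output at 0.5 for {0, 1} class labels
        return np.where(X @ self.w_ + self.b_ >= 0.5, 1, 0)

iris = load_iris()
# Assumed choice: sepal length and petal length, setosa vs. versicolor (first 100 samples);
# any two of the four features can be substituted here.
X = iris.data[:100, [0, 2]]
y = iris.target[:100]
X_std = StandardScaler().fit_transform(X)

ada = AdalineGD().fit(X_std, y)
print("AdalineGD training accuracy:", (ada.predict(X_std) == y).mean())

ppn = Perceptron(eta0=0.1, random_state=1).fit(X_std, y)
print("scikit-learn Perceptron training accuracy:", ppn.score(X_std, y))
```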

Task 3: Explore parameter tuning in scikit-learn logistic regression

We have seen that the 'C' parameter in scikit-learn's logistic regression is the inverse of the regularization strength. In this exercise, you will tune 'C' to get the best performance from the model on Iris classification. Write the code to run logistic regression on the Iris data and report the performance for 10 different values of 'C' (accuracy alone will do); a sketch is given below.
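
A minimal sketch, assuming a 70/30 stratified split and ten values of C spaced logarithmically from 1e-4 to 1e5 (the grid and the split are assumptions):

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=1)

# Standardize features so the regularization term treats them on a common scale
scaler = StandardScaler().fit(X_train)
X_train, X_test = scaler.transform(X_train), scaler.transform(X_test)

# Ten values of C spanning several orders of magnitude (an assumed grid)
for C in np.logspace(-4, 5, 10):
    lr = LogisticRegression(C=C, max_iter=1000).fit(X_train, y_train)
    print(f"C = {C:>10.4g}   test accuracy = {lr.score(X_test, y_test):.3f}")
```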

Task 4: Explore parameter tuning in SVM

Similar to the previous task, use the SVM model with various kernels: 'linear', 'poly', 'rbf', 'sigmoid', and report the performance (accuracy alone will do); a sketch is given below.
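
A minimal sketch, reusing the same assumed 70/30 stratified split and varying only the kernel while all other SVC hyperparameters stay at their defaults:

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=1)
scaler = StandardScaler().fit(X_train)
X_train, X_test = scaler.transform(X_train), scaler.transform(X_test)

# Only the kernel choice varies; C, gamma, degree, etc. stay at their defaults
for kernel in ("linear", "poly", "rbf", "sigmoid"):
    svm = SVC(kernel=kernel).fit(X_train, y_train)
    print(f"kernel = {kernel:>8}   test accuracy = {svm.score(X_test, y_test):.3f}")
```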
