kennedyCzar / ADVANCE-MACHINE-LEARNING-KERNEL-METHOD

Advanced Machine Learning: kernel methods implemented for PCA, KMeans, Logistic Regression, Support Vector Machine (SVM) and Support Vector Data Description (SVDD)


CLASSICAL KMEANS | code | KERNEL KMEANS CODE | Paper
KMeans is a simple yet efficient unsupervised clustering algorithm. It is fast and robust, and it gives reliable results when the observations in the dataset form well-separated groups. It is best used when the number of cluster centers is known in advance, for instance from a well-defined list of types present in the data.
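As a concrete illustration of the classical algorithm (not the repository's own implementation), here is a minimal NumPy sketch of Lloyd's iteration; the deterministic farthest-point initialization is our own choice:

```python
import numpy as np

def kmeans(X, k, n_iter=50):
    """Plain Lloyd's algorithm with deterministic farthest-point initialization."""
    centers = [X[0]]
    for _ in range(k - 1):
        # next center = the point farthest from all centers chosen so far
        d = np.linalg.norm(X[:, None] - np.asarray(centers)[None], axis=2).min(axis=1)
        centers.append(X[d.argmax()])
    centers = np.asarray(centers, dtype=float)
    for _ in range(n_iter):
        # assignment step: each point goes to its nearest center
        labels = np.linalg.norm(X[:, None] - centers[None], axis=2).argmin(axis=1)
        # update step: move each center to the mean of its assigned points
        for c in range(k):
            if np.any(labels == c):
                centers[c] = X[labels == c].mean(axis=0)
    return labels, centers
```

On two well-separated blobs this recovers the grouping exactly; in practice one would also track the inertia to detect convergence early.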
CLASSICAL LOGISTIC REGRESSION | code | KERNEL LOGISTIC REGRESSION CODE | Paper
Logistic regression overcomes the limitation of linear regression on categorical variables by using maximum likelihood estimation to fit the log-odds of class membership. This idea is explained further in the next sections. Our focus, however, is on its kernel version, and on how the inner product of the independent variables can be exploited to classify non-separable data.
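To make the kernel idea concrete, here is a hedged NumPy sketch of kernel logistic regression (the function names, the RBF kernel choice, and the plain gradient-descent loop are illustrative assumptions, not the repository's code). It fits the XOR problem, which no linear model can separate:

```python
import numpy as np

def rbf(A, B, gamma=2.0):
    # Gaussian (RBF) kernel matrix between the rows of A and B
    d2 = ((A[:, None] - B[None]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fit_klr(X, y, gamma=2.0, lam=1e-4, lr=1.0, steps=2000):
    """Kernel logistic regression: gradient descent on the dual weights alpha."""
    K = rbf(X, X, gamma)
    alpha = np.zeros(len(X))
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-K @ alpha))            # predicted probabilities
        alpha -= lr * (K @ (p - y) / len(X) + lam * K @ alpha)
    return alpha

def predict_klr(alpha, Xtr, Xte, gamma=2.0):
    # decision function f(x) = sum_i alpha_i K(x_i, x); threshold at 0
    return (rbf(Xte, Xtr, gamma) @ alpha > 0).astype(int)
```

The model is linear in the (infinite-dimensional) RBF feature space but nonlinear in the input, which is exactly what the kernel trick buys.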
CLASSICAL SVDD | code | KERNEL SVDD CODE | Paper
Support Vector Data Description (SVDD) is an algorithm that finds the smallest hypersphere containing all (or most) of the observations; it is used for outlier detection and one-class classification. SVDD is also a variant of the Support Vector Machine (SVM), usually referred to as the one-class SVM. It is interesting for use cases where researchers care only about a single positive class of interest, which makes it well suited to novelty detection.
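A minimal sketch of the idea, under a deliberate simplification: instead of solving the SVDD quadratic program, we center the hypersphere at the kernel mean of the training data (with an RBF kernel, K(x, x) = 1, so feature vectors lie on the unit sphere). All names here are illustrative, not the repository's:

```python
import numpy as np

def rbf(A, B, gamma=0.5):
    d2 = ((A[:, None] - B[None]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

class SimpleSVDD:
    """Hypersphere centered at the kernel mean (a simplification of SVDD,
    which instead optimizes the center via a quadratic program)."""
    def fit(self, X, gamma=0.5):
        self.X, self.gamma = X, gamma
        self.k_mean = rbf(X, X, gamma).mean()          # mean_ij K(x_i, x_j)
        # radius^2 = largest training distance to the center
        self.r2 = self._dist2(X).max()
        return self

    def _dist2(self, Z):
        # ||phi(z) - mu||^2 = K(z,z) - 2 mean_i K(z, x_i) + mean_ij K(x_i, x_j)
        return 1.0 - 2.0 * rbf(Z, self.X, self.gamma).mean(axis=1) + self.k_mean

    def predict(self, Z):
        # +1 = inside the sphere (normal), -1 = outside (novelty/outlier)
        return np.where(self._dist2(Z) <= self.r2, 1, -1)
```

Points far from the training data have near-zero kernel similarity to every training point, so their feature-space distance to the center exceeds the radius and they are flagged.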
CLASSICAL PCA | code | KERNEL PCA CODE | Paper
Principal Component Analysis (PCA) is an unsupervised dimensionality-reduction technique based on an orthogonal transformation of a higher-dimensional data space into a lower-dimensional subspace. This implicitly means it performs feature extraction, since some of the features in the original space may not be needed to project the data into the reduced subspace.
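The kernel variant can be sketched in a few lines of NumPy: build the kernel matrix, center it in feature space, and keep the top eigenvectors. This is an illustrative sketch, not the repository's implementation:

```python
import numpy as np

def kernel_pca(X, n_components=2, gamma=0.5):
    """Project X onto the top principal components in RBF feature space."""
    d2 = ((X[:, None] - X[None]) ** 2).sum(-1)
    K = np.exp(-gamma * d2)
    # center the kernel matrix in feature space: Kc = K - 1K - K1 + 1K1
    n = len(X)
    one = np.full((n, n), 1.0 / n)
    Kc = K - one @ K - K @ one + one @ K @ one
    # eigh returns eigenvalues in ascending order, so reverse to get the top ones
    vals, vecs = np.linalg.eigh(Kc)
    vals, vecs = vals[::-1], vecs[:, ::-1]
    # projection of training point i onto component j is sqrt(lambda_j) * v_ij
    return vecs[:, :n_components] * np.sqrt(np.clip(vals[:n_components], 1e-12, None))
```

Because the kernel matrix is centered, the projected coordinates have zero mean and mutually orthogonal component axes, mirroring classical PCA.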
Classical perceptron | code | Kernel perceptron code
Perceptrons, the simplest form of neural network, are parametric nonlinear function approximators f(x; θ) used for classification and regression purposes. The algorithm was originally inspired by neuronal circuits in the brain, whose activity is responsible for reflexes and problem-solving intelligence, including but not limited to navigation, planning, object recognition, and visual and speech perception.
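A minimal sketch of Rosenblatt's classical update rule (illustrative, not the repository's code): on each misclassified example, nudge the weights toward that example.

```python
import numpy as np

def perceptron(X, y, epochs=20):
    """Classical perceptron for labels y in {-1, +1}."""
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            if yi * (w @ xi + b) <= 0:    # misclassified (or on the boundary)
                w += yi * xi              # rotate the hyperplane toward xi
                b += yi
    return w, b
```

For linearly separable data the perceptron convergence theorem guarantees a finite number of mistakes; the kernel version replaces the dot products with kernel evaluations.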
CLASSICAL SVM | code | KERNEL SVM CODE
Support Vector Machine (SVM) is a discriminative classifier formally defined by a separating hyperplane. It is a powerful classification algorithm with good theoretical guarantees.
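One simple way to train a linear SVM, shown here as an illustrative sketch (the Pegasos-style sub-gradient loop and all names are our own assumptions, not the repository's implementation): minimize the regularized hinge loss directly.

```python
import numpy as np

def svm_subgradient(X, y, lam=0.01, lr=0.1, epochs=100):
    """Primal hinge-loss SVM trained with sub-gradient descent (y in {-1, +1})."""
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            if yi * (w @ xi + b) < 1:          # inside the margin: hinge is active
                w = (1 - lr * lam) * w + lr * yi * xi
                b += lr * yi
            else:                              # correctly classified with margin:
                w = (1 - lr * lam) * w         # only the regularizer shrinks w
    return w, b
```

Unlike the perceptron, which stops updating once every point is merely on the correct side, the hinge loss keeps pushing until points clear the margin, which is where the max-margin guarantee comes from.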

Gaussian Mixture Model | Classical GMM | kernel GMM
A Gaussian Mixture Model (GMM) is a probabilistic soft-clustering algorithm which assumes that each data point is generated by one of a mixture of Gaussian components, each with some mixing probability.
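The usual fitting procedure is expectation-maximization (EM). A one-dimensional, two-component sketch (illustrative; the deterministic initialization over the data range is our own choice):

```python
import numpy as np

def em_gmm_1d(x, k=2, steps=200):
    """EM for a one-dimensional Gaussian mixture."""
    means = np.linspace(x.min(), x.max(), k)      # spread initial means over the data
    var = np.full(k, x.var())
    w = np.full(k, 1.0 / k)
    for _ in range(steps):
        # E-step: responsibility of each component for each point
        dens = w / np.sqrt(2 * np.pi * var) * np.exp(-(x[:, None] - means) ** 2 / (2 * var))
        r = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate mixing weights, means and variances
        nk = r.sum(axis=0)
        w = nk / len(x)
        means = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - means) ** 2).sum(axis=0) / nk
    return w, means, var
```

The responsibilities `r` are the "soft" cluster assignments: each point contributes fractionally to every component, in contrast to KMeans' hard assignments.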

About


License: MIT


Languages

Language: Python 100.0%