fortune-uwha / ml-logistic-regression-algorithm-challenge

Uwha_Fortune_Oluchi

ML-Logistic-regression-algorithm-challenge

In this notebook I will try to implement Logistic Regression without relying on Python's easy-to-use scikit-learn library. The aim is to build Logistic Regression without the help of built-in Logistic Regression libraries, so that we fully understand how Logistic Regression works in the background.

Before we start coding, let us first understand, or at least try to understand, what happens at the back-end of Logistic Regression. Logistic regression is a classification algorithm. It can be confusing that the term "regression" appears in the name even though logistic regression is actually a classification algorithm, but that is just a name kept for historical reasons. So let's not get confused: logistic regression is a classification algorithm that we apply to settings where the label y is a discrete value, either zero or one.

Logistic regression takes an input and returns a probability, a value between 0 and 1. How does it do that? With the help of a function called the logistic function, more commonly known as the sigmoid. The terms sigmoid function and logistic function are synonyms, so the two terms are interchangeable and either can be used to refer to this function.
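As a minimal sketch of this idea (the function name and the use of NumPy here are my own choices, not taken from the notebook), the sigmoid can be written as:

```python
import numpy as np

def sigmoid(z):
    """Map any real-valued input to a probability in (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

# Large negative inputs map close to 0, zero maps to exactly 0.5,
# and large positive inputs map close to 1.
print(sigmoid(np.array([-10.0, 0.0, 10.0])))  # ~[0.0000454, 0.5, 0.9999546]
```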

The sigmoid function, also called the logistic function, gives an 'S'-shaped curve that can take any real-valued number and map it to a value between 0 and 1. As the input goes to positive infinity, the predicted value approaches 1, and as the input goes to negative infinity, the predicted value approaches 0. If the output of the sigmoid function is more than 0.5, we classify the outcome as 1 or YES, and if it is less than 0.5, we classify it as 0 or NO.
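To make the 0.5 threshold and the "no scikit-learn" goal concrete, here is a rough from-scratch sketch of how such a model could be trained and used for prediction. The function names, learning rate, and plain batch gradient descent on the log-loss are illustrative assumptions, not the exact code in the notebook:

```python
import numpy as np

def predict_proba(X, weights, bias):
    """Probability that each sample belongs to class 1."""
    return sigmoid(X @ weights + bias)

def predict(X, weights, bias, threshold=0.5):
    """Classify as 1 (YES) if the sigmoid output exceeds the threshold, else 0 (NO)."""
    return (predict_proba(X, weights, bias) >= threshold).astype(int)

def fit(X, y, lr=0.1, n_iters=1000):
    """Fit weights and bias with plain batch gradient descent on the log-loss."""
    n_samples, n_features = X.shape
    weights = np.zeros(n_features)
    bias = 0.0
    for _ in range(n_iters):
        y_hat = predict_proba(X, weights, bias)
        error = y_hat - y                       # gradient of the log-loss w.r.t. the linear output
        weights -= lr * (X.T @ error) / n_samples
        bias -= lr * error.mean()
    return weights, bias

# Tiny usage example on a toy, linearly separable problem
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([0, 0, 1, 1])
w, b = fit(X, y)
print(predict(X, w, b))  # expected: [0 0 1 1]
```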
