Ramatoutou / REGRESSION-MODEL


REGRESSION-MODEL

"APPLICATION DE MODÈLE D'APPRENTISSAGE SUPERVISÉ POUR LA REGRESSION : 🤔\n", "It helps in establishing a relationship among the variables by estimating how one variable affects the other\n", "\n", "\n", "REGRESSION IN MACHINE LEARNING\n", "Regression in machine learning consists of mathematical methods that allow data scientists to predict a \n", "continuous outcome (y) based on the value of one or more predictor variables (x). \n", "Linear regression is probably the most popular form of regression analysis because of its ease-of-use \n", "in predicting and forecasting.\n", "\n", "\n", "To evaluate your predictions, there are two important metrics to be considered: variance and bias.\n", "For a model to be ideal, it’s expected to have low variance, low bias and low error\n", "\n", "\n", "Linear Regression:\n", "Linear regression finds the linear relationship between the dependent variable and one or more independent variables using a best-fit straight line.\n", "(In accordance with the number of input and output variables, linear regression is divided into three types: simple linear regression $$y = mx +c$$, multiple linear regression $$y = b_0 + b_1x_1 + b_2x_2 + b_3x_3$$, and multivariate linear regression)\n", "\n", "\n", "THE BIAS-VARIANCE TRADE-OFF\n", "Bias and variance are always in a trade-off. When bias is high, the variance is low and when the variance is low, bias is high. The former case arises when the model is too simple with a fewer number of parameters and the latter when the model is complex with numerous parameters. 
We require both variance and bias to be as small as possible, and to get to that the trade-off needs to be dealt with carefully, then that would bubble up to the desired curve.\n", "\n", "$$y = \theta_0 + \theta_1x_1 + \theta_2x_2 + … + \theta_nx_n$$\n", "\n", "Here, $y$ is the predicted value,\n", "\n", "$n$ is the total number of input features,\n", "\n", "$x_i$ is the input feature for $i^{th}$ value, \n", "\n", "$\theta_i$ is the model parameter ($\theta_0$ is the bias and the coefficients are $\theta_1, \theta_2, … \theta_n$).\n", "\n", "\n", "\n", "\n", "\n", "\n", "1. Les k-plus proches voisins pour la regression\n", "2. Arbre de decision pour la regression\n", "3. Regression linéaires\n", "4. Regression polynomial\n", " \n", " 5. Regularisation: de Ridge\n", " 6. Regularisation: de Lasso\n", " \n", "7. Réseaux de neurones (deep learning)\n", " \n"
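The simple linear model $y = mx + c$ described above can be sketched as follows. This is a minimal illustration assuming scikit-learn is installed; the synthetic data and variable names are illustrative and do not come from the notebook itself:

```python
# Minimal sketch of simple linear regression (y = mx + c)
# on synthetic data, using scikit-learn (an assumption).
import numpy as np
from sklearn.linear_model import LinearRegression

# Synthetic data generated from y = 3x + 2 plus a little noise
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(50, 1))
y = 3 * X.ravel() + 2 + rng.normal(0, 0.5, size=50)

# Fit the best-fit straight line
model = LinearRegression()
model.fit(X, y)

print(model.coef_[0])    # estimated slope m, close to 3
print(model.intercept_)  # estimated intercept c, close to 2
```

Because the noise is small relative to the signal, the fitted coefficients recover the generating slope and intercept closely.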

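The Ridge and Lasso regularization methods listed above can be illustrated on synthetic data. This is a hedged sketch assuming scikit-learn; the data, `alpha` values, and coefficient pattern are illustrative assumptions, not the notebook's own experiment:

```python
# Sketch of Ridge (L2) and Lasso (L1) regularization with
# scikit-learn (an assumption) on synthetic data where only
# the first two of five features actually matter.
import numpy as np
from sklearn.linear_model import Ridge, Lasso

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 5))
# True model uses only features 0 and 1
y = 2 * X[:, 0] - 1.5 * X[:, 1] + rng.normal(0, 0.1, size=100)

# alpha controls regularization strength in both models
ridge = Ridge(alpha=1.0).fit(X, y)
lasso = Lasso(alpha=0.1).fit(X, y)

# Ridge shrinks all coefficients toward zero; Lasso tends to
# drive the irrelevant coefficients to (or very near) exactly zero
print(np.round(ridge.coef_, 3))
print(np.round(lasso.coef_, 3))
```

The contrast in the printed coefficients is the usual motivation for choosing Lasso when feature selection matters and Ridge when all features are believed to contribute.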
About


Languages

Language:Jupyter Notebook 100.0%