
Evaluation Metrics

Evaluation Metrics | Example code and personal notes taken while following the course "Intro to Machine Learning" on Udacity.

accuracy = number of items labeled correctly / total number of items
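A minimal sketch of computing accuracy, both by hand and with scikit-learn's accuracy_score. The labels below are hypothetical, made up only for illustration:

```python
from sklearn.metrics import accuracy_score

# Hypothetical labels: 1 = positive class, 0 = negative class
y_true = [1, 0, 1, 1, 0, 1, 0, 0]
y_pred = [1, 1, 0, 1, 0, 1, 1, 0]

# Accuracy: fraction of all items that were labeled correctly
manual_accuracy = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)
print(manual_accuracy)                 # 0.625
print(accuracy_score(y_true, y_pred))  # 0.625
```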

Confusion matrix

(confusion matrix figure)
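A short sketch of building a confusion matrix with scikit-learn, using the same hypothetical labels as above:

```python
from sklearn.metrics import confusion_matrix

# Hypothetical labels: 1 = positive class, 0 = negative class
y_true = [1, 0, 1, 1, 0, 1, 0, 0]
y_pred = [1, 1, 0, 1, 0, 1, 1, 0]

# Rows are true labels, columns are predicted labels:
# [[TN, FP],
#  [FN, TP]]
print(confusion_matrix(y_true, y_pred))
# -> [[2 2]
#     [1 3]]
```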

Precision & Recall

Recall

recall = true positives / (true positives + false negatives)

Out of all the items that are truly positive, how many were correctly classified as positive? In other words, how many of the positive items were "recalled" from the dataset?
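As a sketch, with the same hypothetical labels, recall can be computed with scikit-learn's recall_score:

```python
from sklearn.metrics import recall_score

y_true = [1, 0, 1, 1, 0, 1, 0, 0]
y_pred = [1, 1, 0, 1, 0, 1, 1, 0]

# recall = TP / (TP + FN); here TP = 3 and FN = 1, so recall = 0.75
print(recall_score(y_true, y_pred))  # 0.75
```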

Precision

precision = true positives / (true positives + false positives)

Out of all the items labeled as positive, how many truly belong to the positive class.
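And the corresponding sketch for precision, again with the same hypothetical labels, using scikit-learn's precision_score:

```python
from sklearn.metrics import precision_score

y_true = [1, 0, 1, 1, 0, 1, 0, 0]
y_pred = [1, 1, 0, 1, 0, 1, 1, 0]

# precision = TP / (TP + FP); here TP = 3 and FP = 2, so precision = 0.6
print(precision_score(y_true, y_pred))  # 0.6
```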

There is a good visual explanation of precision and recall on Wikipedia: https://en.wikipedia.org/wiki/Precision_and_recall
