Confusion Matrix

The document provides an overview of the confusion matrix, a tool used to evaluate the performance of classification models by comparing actual and predicted values. It defines key terms such as True Positive, True Negative, False Positive, and False Negative, and explains performance measures including accuracy, precision, recall, and error rate. Additionally, it includes review questions to reinforce understanding of the concepts presented.


CONFUSION MATRIX

Outline

 Introduction
 Confusion Matrix
 Matrix Terms
 Measure Terms
 Review Questions
 References
INTRODUCTION
• A confusion matrix is a table that is often used to describe the
performance of a classification model on a set of test data for
which the true values are known.
• It is a table of two dimensions: Actual Value and Predicted
Value.
• A confusion matrix is also known as an error matrix.
Cont

 It has four terms


 True Positive (TP)

 True Negative (TN)

 False Positive (FP)

 False Negative (FN)


                      PREDICTED
              NO          YES          Total
ACTUAL  NO    55 [TN]     15 [FP]      70
ACTUAL  Yes   10 [FN]     105 [TP]     115
        Total 65          120          185
Matrix Terms
• True Positives (TP) − the case when both the actual class
and the predicted class of a data point are 1.
• True Negatives (TN) − the case when both the actual class
and the predicted class of a data point are 0.
• False Positives (FP) − the case when the actual class of a
data point is 0 and the predicted class is 1.
• False Negatives (FN) − the case when the actual class of a
data point is 1 and the predicted class is 0.
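The four terms above can be counted directly from paired actual/predicted labels. A minimal Python sketch (using made-up example labels, not the slide's 185-sample dataset) might look like:

```python
# Count TP, TN, FP, FN from paired actual/predicted labels (1 = positive, 0 = negative).
# These labels are illustrative only.
actual    = [1, 0, 1, 1, 0, 0, 1, 0]
predicted = [1, 0, 0, 1, 0, 1, 1, 0]

tp = sum(1 for a, p in zip(actual, predicted) if a == 1 and p == 1)
tn = sum(1 for a, p in zip(actual, predicted) if a == 0 and p == 0)
fp = sum(1 for a, p in zip(actual, predicted) if a == 0 and p == 1)
fn = sum(1 for a, p in zip(actual, predicted) if a == 1 and p == 0)

print(tp, tn, fp, fn)  # → 3 3 1 1
```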
Measure Terms
• Accuracy:
– It is the proportion of all predictions that the
model gets right.

Accuracy = (TP + TN) / Total

= (105 + 55) / 185

= 0.86
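Using the slide's counts (TN = 55, FP = 15, FN = 10, TP = 105), the accuracy calculation can be checked in a few lines of Python:

```python
# Confusion-matrix counts from the slide's 185-sample example.
tp, tn, fp, fn = 105, 55, 15, 10
total = tp + tn + fp + fn      # 185
accuracy = (tp + tn) / total   # 160 / 185
print(round(accuracy, 2))      # → 0.86
```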
Cont

• Precision:
– It is the proportion of data points predicted
positive that are actually positive.

Precision = TP / Predicted Yes

= 105 / 120

= 0.875
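The same counts give precision directly; the denominator is everything the model predicted as positive (TP + FP):

```python
# Precision = fraction of predicted positives that are truly positive.
tp, fp = 105, 15
predicted_yes = tp + fp          # 120
precision = tp / predicted_yes   # 105 / 120 = 0.875
```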
Recall

• Recall:
– It is the ratio of correctly predicted positives
to all actual positive data points.

Recall = TP / Actual Yes

= 105 / 115

= 0.91
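For recall the denominator is the actual positives (TP + FN), as a quick Python check shows:

```python
# Recall = fraction of actual positives the model caught.
tp, fn = 105, 10
actual_yes = tp + fn        # 115
recall = tp / actual_yes    # 105 / 115 ≈ 0.91
print(round(recall, 2))     # → 0.91
```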
Cont

• Error Rate:
– It is the number of incorrect predictions divided by
the total number of predictions.
– The best error rate is 0.0.
– The worst error rate is 1.0.

Error Rate = 1 - Accuracy = (FN + FP) / Total

= 1 - 0.86 = (10 + 15) / 185

= 0.14
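Error rate is the complement of accuracy, which a short sketch with the slide's counts can confirm:

```python
# Error rate = fraction of incorrect predictions; equals 1 - accuracy.
tp, tn, fp, fn = 105, 55, 15, 10
total = tp + tn + fp + fn
accuracy = (tp + tn) / total
error_rate = (fp + fn) / total   # 25 / 185 ≈ 0.14
print(round(error_rate, 2))      # → 0.14
```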
Review Questions

 What is the use of a confusion matrix?

 How do we measure performance?

 What is Recall? Explain.

 What is Error Rate? Explain.

 What is Precision? Explain.

 How do we measure accuracy? Explain with an example.
