Learn Before
  • Common Performance Metrics for Classification

Reference

Precision and Recall performance metrics.

https://developers.google.com/machine-learning/crash-course/classification/precision-and-recall
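The linked reference defines precision and recall from confusion-matrix counts (true positives, false positives, false negatives). A minimal sketch of those two formulas; the helper name is illustrative, not from 1Cademy or the linked course:

```python
def precision_recall(tp, fp, fn):
    """Compute precision and recall from confusion-matrix counts.

    precision = TP / (TP + FP)  -- of the positive predictions, how many were right
    recall    = TP / (TP + FN)  -- of the actual positives, how many were found
    """
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    return precision, recall

# Example: 8 true positives, 2 false positives, 4 false negatives.
p, r = precision_recall(tp=8, fp=2, fn=4)
print(p)  # 0.8
print(r)  # 0.666...
```

The zero-denominator guards matter in practice: a model that predicts no positives at all has an undefined precision, and returning 0.0 is one common convention.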

Updated 2020-10-26

Contributor: john wisniewski (University of Michigan - Ann Arbor)

Tags

Data Science

Related
  • Confusion Matrix
  • ROC Curve and ROC AUC
  • F1 Score
  • Optimizing Criteria in Classification Problems
  • Satisficing Criteria in Classification Problems
  • Bayes error rate
  • What evaluation metric would you want to maximize based on the following scenario?
  • Recall of a Classification Model
  • Precision of a Classification Model
  • Sensitivity Analysis of a Classification Model
  • Learning Curve of a Classification Model
  • Having three evaluation metrics makes it harder for you to quickly choose between two different algorithms, and will slow down the speed with which your team can iterate. True/False?
  • If you had the four following models, which one would you choose based on the following accuracy, runtime, and memory size criteria?
  • Coverage
  • How to choose between precision and recall?
  • F-Measure
© 1Cademy 2026

We're committed to open source on GitHub.