F-Measure
F-Measure is a metric that incorporates aspects of both precision and recall, defined as:

F_β = (1 + β²) · precision · recall / (β² · precision + recall)

The parameter β differentially weights the importance of recall and precision, based perhaps on the needs of an application. Values of β > 1 favor recall, while values of β < 1 favor precision. When β = 1, the measure is simply the F1 score, where precision and recall are equally balanced.
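The definition above can be sketched as a small helper; the function name and example precision/recall values below are illustrative, not from the original.

```python
def f_beta(precision, recall, beta=1.0):
    """F-measure: the weighted harmonic mean of precision and recall.

    beta > 1 weights recall more heavily; beta < 1 weights precision.
    beta = 1 reduces to the familiar F1 score.
    """
    b2 = beta ** 2
    return (1 + b2) * precision * recall / (b2 * precision + recall)

# With beta = 1 (the F1 score), precision and recall count equally:
print(f_beta(0.5, 0.8))            # ≈ 0.615
# With beta = 2, the same recall pulls the score upward:
print(f_beta(0.5, 0.8, beta=2.0))  # ≈ 0.714
```

Note that the F-measure is undefined when both precision and recall are zero; production metric libraries typically return 0 in that case.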
Tags
Data Science
Related
Confusion Matrix
ROC Curve and ROC AUC
Precision and Recall performance metrics
F1 Score
Optimizing Criteria in Classification Problems
Satisficing Criteria in Classification Problems
Bayes error rate
What evaluation metric would you want to maximize based on the following scenario?
Recall of a Classification Model
Precision of a Classification Model
Sensitivity Analysis of a Classification Model
Learning Curve of a Classification Model
Having three evaluation metrics makes it harder for you to quickly choose between two different algorithms, and will slow down the speed with which your team can iterate. True/False?
If you had the four following models, which one would you choose based on the following accuracy, runtime, and memory size criteria?
Coverage
How to choose between precision and recall?
F-Measure
Sensitivity
Relation between Precision and Recall
Macro-average Precision of a Classification Model
Micro-average Precision of a Classification Model