[Feature] Add classification metric MCC #3536
Closed
Description
🚀 Feature
While the current metric suite covers most use cases, it is missing a critical evaluator.
Matthews Correlation Coefficient (MCC): Widely considered one of the most reliable metrics for binary and multiclass classification, especially when dealing with imbalanced datasets, as it accounts for all four quadrants of the confusion matrix.
Proposed Change
I propose adding this metric to our standard library:
matthews_corrcoef: Calculates the MCC score, ensuring it returns a value between -1 and +1.
The implementation should ideally mirror the API of scikit-learn for consistency.
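To make the proposal concrete, here is a minimal sketch of what such a `matthews_corrcoef` could look like. It uses the multiclass generalization of MCC computed from the confusion matrix (the same formulation scikit-learn uses); the function name and signature follow scikit-learn's convention, but everything else here is an illustrative assumption, not the final implementation:

```python
import math

def matthews_corrcoef(y_true, y_pred):
    """Multiclass MCC from the confusion matrix; returns a value in [-1, +1]."""
    labels = sorted(set(y_true) | set(y_pred))
    idx = {label: i for i, label in enumerate(labels)}
    n = len(labels)
    # Confusion matrix: rows = true class, columns = predicted class.
    C = [[0] * n for _ in range(n)]
    for true, pred in zip(y_true, y_pred):
        C[idx[true]][idx[pred]] += 1
    s = sum(sum(row) for row in C)        # total number of samples
    c = sum(C[k][k] for k in range(n))    # total correctly predicted
    t = [sum(C[k]) for k in range(n)]     # times each class truly occurred
    p = [sum(C[r][k] for r in range(n)) for k in range(n)]  # times predicted
    num = c * s - sum(pk * tk for pk, tk in zip(p, t))
    den = math.sqrt((s * s - sum(pk * pk for pk in p)) *
                    (s * s - sum(tk * tk for tk in t)))
    # Degenerate cases (e.g. a single class) give a zero denominator;
    # scikit-learn defines the result as 0 there.
    return num / den if den else 0.0
```

For the binary case this reduces to the familiar (TP*TN - FP*FN) / sqrt((TP+FP)(TP+FN)(TN+FP)(TN+FN)); for example, `matthews_corrcoef([1, 1, 1, 0], [1, 0, 1, 1])` yields -1/3.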
References:
MCC