[Feature] Add classification metric MCC #3536

@aaishwarymishra

Description

🚀 Feature

While the current metric suite covers most common metrics, it is missing a critical evaluator.

Matthews Correlation Coefficient (MCC): Widely considered one of the most reliable metrics for binary and multiclass classification, especially when dealing with imbalanced datasets, as it accounts for all four quadrants of the confusion matrix.
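For reference, MCC is computed directly from the four confusion-matrix counts (true/false positives and negatives), which is why it stays informative under class imbalance:

```math
\mathrm{MCC} = \frac{TP \cdot TN - FP \cdot FN}{\sqrt{(TP+FP)(TP+FN)(TN+FP)(TN+FN)}}
```

When any factor in the denominator is zero, the score is conventionally defined as 0.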

Proposed Change

I propose adding this metric to our standard library:

matthews_corrcoef: Computes the MCC score, returning a value between -1 and +1.

The implementation should ideally mirror the API of scikit-learn for consistency.

References:
MCC
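A minimal sketch of what the proposed function could look like for the binary case, assuming a scikit-learn-style `matthews_corrcoef(y_true, y_pred)` signature (the function name follows scikit-learn; the implementation details here are illustrative, not the final design):

```python
import numpy as np

def matthews_corrcoef(y_true, y_pred):
    """Binary MCC from confusion-matrix counts (illustrative sketch)."""
    y_true = np.asarray(y_true)
    y_pred = np.asarray(y_pred)
    # The four quadrants of the binary confusion matrix
    tp = np.sum((y_true == 1) & (y_pred == 1))
    tn = np.sum((y_true == 0) & (y_pred == 0))
    fp = np.sum((y_true == 0) & (y_pred == 1))
    fn = np.sum((y_true == 1) & (y_pred == 0))
    num = tp * tn - fp * fn
    den = np.sqrt(float((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn)))
    # Common convention: return 0 when the denominator is zero
    return 0.0 if den == 0 else num / den

# Perfect agreement gives +1, perfect disagreement gives -1
print(matthews_corrcoef([1, 0, 1, 0], [1, 0, 1, 0]))  # → 1.0
```

A multiclass version would generalize this via the full confusion matrix, matching scikit-learn's behavior.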
