This repository was archived by the owner on Apr 8, 2025. It is now read-only.

List of metrics documented in Processor but not supported for evaluation.  #776

@johann-petrak

Description


The documentation for TextClassificationProcessor states that the parameter metric can take a list of metrics:

:param metric: name of metric that shall be used for evaluation, e.g. "acc" or "f1_macro".

However, this is not actually supported at evaluation time in evaluation.metrics.compute_metrics (see

def compute_metrics(metric, preds, labels):

).

Since this is a nice and obvious feature, I think it should be implemented.

Once implemented, it may actually make predefined combined metrics like "acc_f1" superfluous.
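A minimal sketch of how compute_metrics could accept either a single metric name or a list of names. The metric registry below is hypothetical and only illustrates the dispatch pattern; FARM's actual set of registered metrics and their return shapes differ.

```python
from sklearn.metrics import accuracy_score, f1_score

# Hypothetical registry of metric functions, keyed by the names
# used in the Processor docs ("acc", "f1_macro").
METRIC_FUNCS = {
    "acc": lambda preds, labels: {"acc": accuracy_score(labels, preds)},
    "f1_macro": lambda preds, labels: {
        "f1_macro": f1_score(labels, preds, average="macro")
    },
}

def compute_metrics(metric, preds, labels):
    # Accept a single metric name (str) or a list of names,
    # and merge all results into one dict.
    names = [metric] if isinstance(metric, str) else metric
    results = {}
    for name in names:
        results.update(METRIC_FUNCS[name](preds, labels))
    return results
```

With this shape, a combined metric such as "acc_f1" could simply be expressed as the list ["acc", "f1_macro"].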

Metadata

Labels: bug (Something isn't working)
