Enforce positive sample_weight #15531

@rth

Description

As discussed in #12464 (comment) and #15358 (comment), negative sample_weight values can be meaningful in some specific use cases; in most cases, however, they should never occur.

So we may want to:

  • add a force_positive=None parameter to _check_sample_weights;
  • add an assume_positive_sample_weights=True config parameter to sklearn.set_config / get_config.

By default (force_positive=None), negative sample weights would raise an error, but this check could be disabled globally with sklearn.set_config(assume_positive_sample_weights=False).

With _check_sample_weights(.., force_positive=True), the check would always be performed, irrespective of the config parameter.

If there are no objections to this, please tag this issue as "Help Wanted".
