
ReduceLROnPlateau fails when a new parameter group is added to the optimizer #20997

Description

@thuwyh

🐛 Bug

ReduceLROnPlateau fails when a new parameter group is added to the optimizer.
The _reduce_lr function raises a "list index out of range" IndexError.

To Reproduce

Steps to reproduce the behavior:

  1. Initialize the optimizer: optimizer = Adam(filter(lambda p: p.requires_grad, model.parameters()), args.lr)
  2. Initialize the scheduler: scheduler = ReduceLROnPlateau(optimizer, patience=1, factor=0.1, verbose=True, mode='max')
  3. Add a parameter group to the optimizer: optimizer.add_param_group({'params': unfreezed_params, 'lr': lr})
  4. An IndexError is raised as soon as the scheduler tries to reduce the learning rate (a runnable sketch follows below).
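
A minimal, self-contained sketch of the failure; the two Linear layers and the metric values are placeholders standing in for the model and validation metric from the original report:

```python
from torch.nn import Linear
from torch.optim import Adam
from torch.optim.lr_scheduler import ReduceLROnPlateau

frozen = Linear(10, 10)  # stands in for the initially frozen part of the model
head = Linear(10, 2)     # stands in for the trainable head

optimizer = Adam(head.parameters(), lr=1e-3)
scheduler = ReduceLROnPlateau(optimizer, patience=1, factor=0.1, mode='max')

# Later in training: unfreeze and register the extra parameters.
# min_lrs was sized in __init__ for one param group and is never resized.
optimizer.add_param_group({'params': frozen.parameters(), 'lr': 1e-4})

# Feed the scheduler a stagnating metric; once patience is exhausted it calls
# _reduce_lr, which indexes self.min_lrs for every param group and raises
# IndexError: list index out of range for the group added above.
for metric in [0.5, 0.5, 0.5]:
    scheduler.step(metric)
```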

Expected behavior

When a parameter group is added to the optimizer, the min_lrs attribute of the scheduler should be updated (extended with an entry for the new group) so that this error does not occur.
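
Until the scheduler handles this itself, a possible workaround (continuing the sketch above) is to extend min_lrs by hand whenever a group is added. This pokes at a private attribute of the scheduler, so it is a sketch rather than a supported API:

```python
# Possible workaround: keep scheduler.min_lrs in sync by hand. min_lrs is an
# internal attribute, so this may break across PyTorch versions.
optimizer.add_param_group({'params': frozen.parameters(), 'lr': 1e-4})
scheduler.min_lrs.append(scheduler.min_lrs[0])  # reuse the existing floor
```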

Environment

  • PyTorch Version: 1.1.0

Metadata


    Labels

    module: optimizer (Related to torch.optim)
    triaged (This issue has been looked at by a team member, and triaged and prioritized into an appropriate module)
