
Support multiple simultaneous LR schedulers #13022

@jeanm

🚀 Feature

Currently, only one _LRScheduler can be used with a given optimizer. It would be great to support using multiple schedulers simultaneously.

Motivation

It's not uncommon in NLP (see e.g. here) to want to use both StepLR and ReduceLROnPlateau. This isn't currently possible, because StepLR computes learning-rate updates from the initial learning rate and the number of epochs that have elapsed (see get_lr()), all under the assumption that the learning rate hasn't been touched by anything else.
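
To make the conflict concrete, here is a minimal sketch of a training loop that steps both schedulers on the same optimizer. The model, optimizer settings, and validation metric are placeholders, and the overwriting behaviour described in the comments is that of the schedulers at the time of this issue:

```python
import torch
from torch.optim import SGD
from torch.optim.lr_scheduler import StepLR, ReduceLROnPlateau

model = torch.nn.Linear(10, 1)
optimizer = SGD(model.parameters(), lr=0.1)

step_sched = StepLR(optimizer, step_size=10, gamma=0.5)
plateau_sched = ReduceLROnPlateau(optimizer, factor=0.1, patience=2)

for epoch in range(30):
    optimizer.zero_grad()
    loss = model(torch.randn(4, 10)).pow(2).mean()
    loss.backward()
    optimizer.step()

    val_loss = loss.item()  # placeholder for a real validation metric
    # ReduceLROnPlateau lowers param_groups[i]['lr'] in place...
    plateau_sched.step(val_loss)
    # ...but StepLR then recomputes the learning rate from its stored
    # initial value and the epoch count, overwriting that reduction.
    step_sched.step()
```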

Pitch

This can be fixed by having StepLR, ExponentialLR (and potentially others) query the optimizer for the current learning rate, instead of attempting to work out what it should be from scratch. ReduceLROnPlateau already does this in _reduce_lr().
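
As an illustration of the pitch, here is a sketch of a multiplicative variant of StepLR; MultiplicativeStepLR is a hypothetical name, not a class in torch.optim. It reads the current learning rate from the optimizer's param_groups and scales it in place, mirroring what ReduceLROnPlateau._reduce_lr() does, so it composes with other schedulers that also adjust the rate in place:

```python
class MultiplicativeStepLR:
    """Hypothetical scheduler: scales the optimizer's *current* lr
    by gamma every step_size epochs, instead of recomputing the lr
    from the initial value and the epoch count."""

    def __init__(self, optimizer, step_size, gamma):
        self.optimizer = optimizer
        self.step_size = step_size
        self.gamma = gamma
        self.last_epoch = 0

    def step(self):
        self.last_epoch += 1
        if self.last_epoch % self.step_size == 0:
            for group in self.optimizer.param_groups:
                # Query the optimizer for the current learning rate and
                # scale it, rather than overwriting it with a value
                # derived from the initial lr.
                group['lr'] = group['lr'] * self.gamma
```

Dropped into the loop above in place of StepLR, this would no longer overwrite the plateau reduction, since each scheduler only rescales whatever value it finds in the optimizer.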

Labels

module: optimizer, triaged