Labels: module: optimizer (Related to torch.optim), triaged (This issue has been looked at by a team member, and triaged and prioritized into an appropriate module)
Description
🚀 Feature
Currently, only one _LRScheduler can be used with a given optimizer. It would be great to support using multiple schedulers simultaneously.
Motivation
It's not uncommon in NLP (see e.g. here) to want to use both StepLR and ReduceLROnPlateau. This isn't possible currently, because StepLR calculates learning rate updates from the initial learning rate and the number of epochs that have elapsed (see get_lr()), all under the assumption that the learning rate hasn't been touched by anything else.
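For concreteness, here is a minimal sketch of the usage pattern described above (the toy model, constant metric, and hyperparameters are made up for illustration). Whether the overwrite actually occurs depends on the PyTorch version, since later releases changed how StepLR computes its update, but under the closed-form get_lr() behavior described in this report, StepLR recomputes the learning rate from base_lrs and the epoch count, discarding any reduction ReduceLROnPlateau has applied:

```python
import torch

# Toy setup (hypothetical model and constant metric, for illustration only).
model = torch.nn.Linear(1, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

step_lr = torch.optim.lr_scheduler.StepLR(optimizer, step_size=10, gamma=0.5)
plateau = torch.optim.lr_scheduler.ReduceLROnPlateau(optimizer, factor=0.1, patience=2)

for epoch in range(30):
    # ... training and optimizer.step() would go here ...
    val_loss = 1.0          # constant metric -> guaranteed plateau
    plateau.step(val_loss)  # lowers param_group['lr'] in place once patience runs out
    step_lr.step()          # recomputes the LR from base_lrs and the epoch count,
                            # which can silently undo the reduction made above
    print(epoch, optimizer.param_groups[0]['lr'])
```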
Pitch
This can be fixed by having StepLR, ExponentialLR (and potentially others) query the optimizer for the current learning rate, instead of attempting to work out what it should be from scratch. ReduceLROnPlateau already does this in _reduce_lr().
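As a sketch of what that could look like, here is a hypothetical step-decay scheduler (not part of torch.optim; the name CurrentLRStepDecay is made up) that, like ReduceLROnPlateau._reduce_lr(), reads and rewrites param_group['lr'] directly instead of recomputing it from the stored base learning rates:

```python
class CurrentLRStepDecay:
    """Hypothetical sketch, not a torch.optim API: decays the optimizer's
    *current* learning rate by `gamma` every `step_size` epochs, so changes
    made by other schedulers (e.g. ReduceLROnPlateau) are preserved."""

    def __init__(self, optimizer, step_size, gamma=0.1):
        self.optimizer = optimizer
        self.step_size = step_size
        self.gamma = gamma
        self.last_epoch = 0

    def step(self):
        self.last_epoch += 1
        if self.last_epoch % self.step_size == 0:
            for group in self.optimizer.param_groups:
                # Query the current value rather than recomputing from base_lrs.
                group['lr'] = group['lr'] * self.gamma
```

A scheduler written this way could then be stepped alongside ReduceLROnPlateau on the same optimizer without either one clobbering the other's updates.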