Redefine scheduler to set learning rate using recursive formula #14010
Conversation
torch/optim/lr_scheduler.py
Outdated
    return [group['lr'] * self.gamma
            for group in self.optimizer.param_groups]
    # return [base_lr * self.gamma ** (self.last_epoch // self.step_size)
    #         for base_lr in self.base_lrs]
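For context, a minimal sketch (hypothetical helper functions, not the PyTorch source) contrasting the two approaches in the diff above: the old closed form recomputes the rate from `base_lr` alone, while the recursive form multiplies whatever the current rate is.

```python
# Illustrative sketch only (hypothetical helpers, not the PyTorch source):
# contrast the closed-form and recursive ways of computing StepLR's rate.

def closed_form_lr(base_lr, gamma, step_size, epoch):
    # Old behavior: the rate depends only on base_lr and the epoch counter,
    # so any change made by another scheduler is silently overwritten.
    return base_lr * gamma ** (epoch // step_size)

def recursive_lr(current_lr, gamma, step_size, epoch):
    # New behavior: derive the rate from the *current* rate, so updates
    # made by other schedulers in between are preserved.
    if epoch == 0 or epoch % step_size != 0:
        return current_lr
    return current_lr * gamma

lr = 1.0
for epoch in range(1, 7):          # decays at epochs 3 and 6
    lr = recursive_lr(lr, gamma=0.1, step_size=3, epoch=epoch)
assert abs(lr - closed_form_lr(1.0, 0.1, 3, 6)) < 1e-12
```

When only one scheduler is attached, the two formulations agree, as the final assertion shows; they diverge once something else also touches `group['lr']`.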
                                         threshold=0.1, patience=5, cooldown=5)
        self._test_reduce_lr_on_plateau(scheduler, targets, metrics, epochs)

    def test_compound_step_and_multistep_lr(self):
Don't forget to fix the Python lint errors (see the Travis job).
    def get_lr(self):
        return [base_lr * self.gamma ** (self.last_epoch // self.step_size)
                for base_lr in self.base_lrs]
        if (self.last_epoch == 0) or (self.last_epoch % self.step_size != 0):
torch/optim/lr_scheduler.py
Outdated
    milestones = set(milestones)
    # if not list(milestones) == sorted(milestones):
    #     raise ValueError('Milestones should be a list of'
    #                      ' increasing integers. Got {}', milestones)
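The commented-out check above validated the ordering before the set conversion discards it. A standalone sketch of that validation (hypothetical function name, not the PyTorch source):

```python
def check_milestones(milestones):
    # Milestones must be given in increasing order; validate before
    # converting to a set, since a set loses the original ordering.
    if list(milestones) != sorted(milestones):
        raise ValueError('Milestones should be a list of'
                         ' increasing integers. Got {}'.format(milestones))
    return set(milestones)

assert check_milestones([30, 60, 90]) == {30, 60, 90}
```

Passing an out-of-order list such as `[60, 30]` raises `ValueError` instead of silently reordering the decay points.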
Changes LGTM.

Nit: when naming a PR, name it for what it does, not after a URL that you have to click through to find out what it's about ;)

Is this blocked on something? Would be great to have it merged!
facebook-github-bot
left a comment
@chandlerzuo has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
facebook-github-bot
left a comment
@chandlerzuo is landing this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
Revert "Revert "Redefine scheduler to set learning rate using recursive formula" #14010 (#21463)" and enable closed form with non-sequential epoch parameter

This reverts commit 3889855.

gh-metadata: pytorch pytorch 21800 gh/vincentqb/6/head
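Per the commit message, the reinstated change keeps the recursive update for ordinary sequential stepping but falls back to the closed form when `step` is called with an explicit, non-sequential epoch. A rough self-contained sketch of that dispatch (hypothetical class and attribute names, not the actual `torch.optim.lr_scheduler` code):

```python
class SketchStepLR:
    # Hypothetical illustration of the dual-path update; not the
    # actual torch.optim.lr_scheduler implementation.
    def __init__(self, base_lr, gamma, step_size):
        self.base_lr = base_lr
        self.gamma = gamma
        self.step_size = step_size
        self.lr = base_lr
        self.last_epoch = 0

    def _closed_form_lr(self, epoch):
        # Direct formula in terms of base_lr; valid for any epoch.
        return self.base_lr * self.gamma ** (epoch // self.step_size)

    def step(self, epoch=None):
        if epoch is None:
            # Sequential use: recursive update from the current rate.
            self.last_epoch += 1
            if self.last_epoch % self.step_size == 0:
                self.lr *= self.gamma
        else:
            # Non-sequential jump: recompute from base_lr in closed form,
            # since the recursion only makes sense for consecutive epochs.
            self.last_epoch = epoch
            self.lr = self._closed_form_lr(epoch)
        return self.lr
```

Stepping sequentially through epochs 1 and 2 with `step_size=2` decays once; jumping straight to `step(epoch=10)` instead recomputes the rate in closed form from `base_lr`.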
Modified the learning-rate step logic for StepLR, MultiStepLR, ExponentialLR, and CosineAnnealingLR so that each scheduler updates the current rate rather than overwriting it. In this way, multiple schedulers can be used simultaneously to modify the learning rates.
Related issue: #13022
Added unit tests combining multiple schedulers.
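A minimal plain-Python sketch (toy classes, not the torch API) of why the recursive formulation lets schedulers compose:

```python
class TinyStepLR:
    # Toy stand-ins for illustration, not torch.optim classes: each
    # step multiplies the *current* rate, so effects compose.
    def __init__(self, gamma, step_size):
        self.gamma = gamma
        self.step_size = step_size

    def step(self, lr, epoch):
        if epoch != 0 and epoch % self.step_size == 0:
            return lr * self.gamma
        return lr

class TinyExponentialLR:
    def __init__(self, gamma):
        self.gamma = gamma

    def step(self, lr, epoch):
        return lr * self.gamma if epoch != 0 else lr

lr = 1.0
schedulers = [TinyStepLR(gamma=0.5, step_size=2), TinyExponentialLR(gamma=0.9)]
for epoch in range(4):
    for s in schedulers:
        lr = s.step(lr, epoch)
# Both the one-time step decay (at epoch 2) and the per-epoch exponential
# decay (epochs 1-3) are reflected in the final rate: 0.5 * 0.9 ** 3.
assert abs(lr - 0.5 * 0.9 ** 3) < 1e-12
```

With the old closed-form update, whichever scheduler stepped last would have recomputed the rate from `base_lr` and erased the other's contribution.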