
Add warmup_start_lr and eta_min to WarmupCosineSchedule #3989

@holgerroth

Description


The current implementation of WarmupCosineSchedule starts the learning rate at 0. If step() is only called after the first epoch, as here, the entire first epoch runs with a learning rate of zero.

Alternative implementations let the user set a warmup_start_lr and an eta_min to avoid starting from, or decaying down to, zero, respectively; see for example here.
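For illustration, the schedule being requested could be sketched as a plain function (the function name and signature here are hypothetical, not MONAI's API): linear warmup from `warmup_start_lr` up to the base learning rate, followed by cosine decay down to `eta_min` rather than to zero.

```python
import math

def warmup_cosine_lr(step, base_lr, warmup_steps, total_steps,
                     warmup_start_lr=0.0, eta_min=0.0):
    """Learning rate at `step`: linear warmup from warmup_start_lr to
    base_lr over warmup_steps, then cosine decay from base_lr to eta_min."""
    if step < warmup_steps:
        # Linear warmup: starts at warmup_start_lr instead of 0.
        frac = step / max(1, warmup_steps)
        return warmup_start_lr + (base_lr - warmup_start_lr) * frac
    # Cosine decay over the remaining steps: bottoms out at eta_min, not 0.
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return eta_min + 0.5 * (base_lr - eta_min) * (1.0 + math.cos(math.pi * progress))
```

With `warmup_start_lr=1e-4`, step 0 yields 1e-4 instead of 0 (so an epoch trained before the first `step()` call still learns), and the final step yields `eta_min` instead of 0.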

Metadata

Labels: enhancement (New feature or request)
