ReduceLROnPlateau LRScheduler error #462
Closed
Hi,
I tried to use the LRScheduler handler for ReduceLROnPlateau from torch.optim and it threw a TypeError:
plateau_scheduler = optim.lr_scheduler.ReduceLROnPlateau(optimizer=optimizer, mode='min', factor=0.5, patience=1)
ignite_plateau_scheduler = LRScheduler(plateau_scheduler)
---------------------------------------------------------------------------
TypeError Traceback (most recent call last)
<ipython-input-13-56cb18e5af57> in <module>
1 plateau_scheduler = optim.lr_scheduler.ReduceLROnPlateau(optimizer=optimizer, mode='min', factor=0.5, patience=1)
----> 2 ignite_plateau_scheduler = LRScheduler(plateau_scheduler)
/net/vaosl01/opt/NFS/su0/anaconda3/envs/nlpbook/lib/python3.7/site-packages/pytorch_ignite-0.2.0-py3.7.egg/ignite/contrib/handlers/param_scheduler.py in __init__(self, lr_scheduler, save_history, **kwds)
424 if not isinstance(lr_scheduler, _LRScheduler):
425 raise TypeError("Argument lr_scheduler should be a subclass of torch.optim.lr_scheduler._LRScheduler, "
--> 426 "but given {}".format(type(lr_scheduler)))
427
428 if len(lr_scheduler.optimizer.param_groups) > 1:
TypeError: Argument lr_scheduler should be a subclass of torch.optim.lr_scheduler._LRScheduler, but given <class 'torch.optim.lr_scheduler.ReduceLROnPlateau'>
The example from the documentation with StepLR works without problems. It seems that the two schedulers have different base classes, as can be seen from the source here: StepLR inherits from _LRScheduler, while ReduceLROnPlateau does not (it inherits directly from object). Is there a way to bypass that check for this scheduler?
Thanks.
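One possible workaround (a sketch, not an official ignite API): skip the LRScheduler wrapper entirely and call ReduceLROnPlateau.step() yourself from an event handler, since only the wrapper performs the isinstance(_LRScheduler) check. The on_epoch_completed function below is a hypothetical handler; in ignite it would be attached via trainer.add_event_handler, but here it is invoked manually so the snippet runs with torch alone:

```python
import torch
from torch import nn, optim

model = nn.Linear(2, 1)
optimizer = optim.SGD(model.parameters(), lr=0.1)
plateau_scheduler = optim.lr_scheduler.ReduceLROnPlateau(
    optimizer, mode='min', factor=0.5, patience=1)

def on_epoch_completed(val_loss):
    # Hypothetical handler: in ignite this could be attached with
    # trainer.add_event_handler(Events.EPOCH_COMPLETED, ...) and read the
    # validation loss from engine.state.metrics.
    plateau_scheduler.step(val_loss)

# Simulate three epochs with a non-improving validation loss; with
# patience=1 the scheduler reduces the lr once patience is exceeded.
for val_loss in [1.0, 1.0, 1.0]:
    on_epoch_completed(val_loss)

print(optimizer.param_groups[0]['lr'])
```

After the plateau the learning rate is multiplied by factor=0.5, so this prints 0.05. The trade-off is that you lose the wrapper's save_history support, but no base-class check gets in the way.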