The tests for torch.legacy.optim don't have "legacy" in the filename (test/optim/test.py, unlike test/test_legacy_nn.py), which leads me to believe torch.optim doesn't yet deprecate the legacy optimizers. However, afaict, parameter/gradient flattening is no longer the preferred approach, so torch.legacy.optim doesn't seem ideal; meanwhile torch.optim only has SGD and is missing Adagrad/Adam/RMSProp/L-BFGS/etc. I also can't find a branch or issue working on these, even though #5 says optim is done.
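For reference, here's a minimal sketch of the non-flattened style that torch.optim.SGD already exposes, assuming my reading of the API is right (the legacy interface, following Lua torch's optim, instead operates on a single flattened parameter/gradient tensor):

```python
import torch
import torch.nn as nn
import torch.optim as optim

model = nn.Linear(10, 1)
loss_fn = nn.MSELoss()

# torch.optim style: the optimizer holds references to the individual
# parameter tensors, so no flattening into one big vector is needed.
optimizer = optim.SGD(model.parameters(), lr=0.01)

x = torch.randn(4, 10)
y = torch.randn(4, 1)

optimizer.zero_grad()   # clear each parameter's .grad buffer
loss = loss_fn(model(x), y)
loss.backward()         # gradients accumulate per parameter
optimizer.step()        # in-place SGD update on each parameter
```

The appeal of this style is that each parameter keeps its own tensor and gradient, which I assume is why the flattened torch.legacy.optim interface is no longer preferred.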
What's the plan here?