
More optimizers in torch.optim #175

@bshillingford

The tests for torch.legacy.optim don't have "legacy" in the filename (test/optim/test.py, unlike test/test_legacy_nn.py), which leads me to believe torch.optim doesn't yet deprecate the legacy ones. However, as far as I can tell, parameter/gradient flattening is no longer the preferred approach, so torch.legacy.optim doesn't seem ideal, while torch.optim only has SGD and is missing Adagrad/Adam/RMSProp/L-BFGS/etc. I also can't find a branch or issue working on these, but #5 says optim is done.
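For context, here is a minimal sketch of the per-parameter style that torch.optim.SGD uses, as opposed to the flattened-parameter closures in torch.legacy.optim. This is written against the present-day torch.optim API and is only illustrative; the exact API at the time of this issue may have differed.

```python
# Sketch: the non-flattened torch.optim style (illustrative, current API).
import torch
import torch.nn as nn
import torch.optim as optim

model = nn.Linear(10, 2)
criterion = nn.MSELoss()

# The optimizer takes the model's parameters directly; there is no manual
# flattening of parameters/gradients into a single contiguous tensor, which
# is what the legacy (Lua-style) optim interface relied on.
optimizer = optim.SGD(model.parameters(), lr=0.01, momentum=0.9)

inputs = torch.randn(4, 10)
targets = torch.randn(4, 2)

optimizer.zero_grad()                          # clear per-parameter gradients
loss = criterion(model(inputs), targets)
loss.backward()                                # populate .grad on each parameter
optimizer.step()                               # update each parameter in place
```

The open question in this issue is whether the other optimizers (Adagrad, Adam, RMSProp, L-BFGS, ...) will be ported to this per-parameter interface.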

What's the plan here?
