inference without Volatile=True blows up memory #235

@soumith

Description

As @colesbury says:

If you're doing inference only, you might need to set volatile=True. Otherwise you often end up with reference cycles which aren't collected immediately. (This is something we should fix in PyTorch)

See pytorch/examples#12
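
A minimal sketch of the advice above. In the PyTorch of this issue's era it meant constructing inputs with `Variable(x, volatile=True)`; `volatile` was later removed in PyTorch 0.4, and the equivalent today is the `torch.no_grad()` context, which keeps autograd from recording the graph (and from creating the reference cycles that delay memory being freed). The model and shapes here are placeholders, not from the issue:

```python
import torch
import torch.nn as nn

# A stand-in model for illustration only.
model = nn.Linear(4, 2)
model.eval()

x = torch.randn(3, 4)

# Inference without graph construction: the modern replacement for
# Variable(x, volatile=True). No autograd history is recorded, so
# intermediate buffers are not kept alive by the graph.
with torch.no_grad():
    out = model(x)
```

Because no history was recorded, `out.requires_grad` is `False` and `out.grad_fn` is `None`, so nothing in the graph holds references to intermediates between iterations.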
