
can't change requires_grad for non leaf variables #166

@glample

In my code I have a target y that is computed using some Linear and Conv2d modules, and I can't set y.requires_grad = False before passing it to a loss module.

The problem comes from:

if self.creator is not None:
    raise RuntimeError("you can only change requires_grad flags of "
                       "leaf variables")

Is there any particular reason why this restriction was implemented?

Also, I was wondering what would happen if we have something like:

x -> var1 -> var2 -> var3 -> var_loss

and we call var_loss.backward() when every variable requires a gradient apart from var2, for instance? Would it be as if var1.requires_grad were also set to False?
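
To make that concrete, a small sketch of the chain above (again assuming current PyTorch; since requires_grad cannot be switched off on the non-leaf var2, detach() stands in for the hypothetical break at var2):

import torch

x = torch.randn(3, requires_grad=True)

# Intact chain: gradients flow all the way back to x.
var1 = x * 2
var2 = var1.sin()
var3 = var2 * 3
var_loss = var3.sum()
var_loss.backward()
print(x.grad is not None)  # True

# Chain broken at var2: everything downstream of the cut no
# longer requires grad, and x / var1 never receive gradients,
# i.e. it behaves as if their requires_grad were False too.
x.grad = None
var2 = (x * 2).sin().detach()
var3 = var2 * 3
var_loss = var3.sum()
print(var_loss.requires_grad)  # False; backward() here would raise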
