In my code I have a target y which is computed using some Linear and Conv2d modules, and I can't do y.requires_grad = False before I pass it to a loss module.
Problem comes from:
pytorch/torch/autograd/variable.py, lines 39 to 41 at 93d02e4:

```python
if self.creator is not None:
    raise RuntimeError("you can only change requires_grad flags of "
                       "leaf variables")
```
Is there any particular reason why this restriction was implemented?
Also I was wondering what would happen if we have something like:
x -> var1 -> var2 -> var3 -> var_loss
and we call var_loss.backward() when every variable requires a gradient except var2, for instance? Would it behave as if var1.requires_grad were also set to False?