🚀 The feature, motivation and pitch
I would like to compute Jacobian-vector products for a function used in affine image registration. Running torch.func.jvp on it raised the NotImplementedError in the title: forward-mode AD has not been implemented for affine_grid_generator.
Alternatives
torch.func.vjp works for my function, so reverse-mode AD is available.
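Since reverse mode works, one interim workaround is to recover the JVP from two VJPs: the pullback u ↦ Jᵀu is linear in u, so taking its VJP with cotangent v yields (Jᵀ)ᵀv = Jv. A minimal sketch for a single-tensor-input function (`jvp_via_double_vjp` is a made-up helper name, not a PyTorch API):

```python
import torch

def jvp_via_double_vjp(f, x, v):
    """Compute (f(x), J(x) @ v) using two reverse-mode passes,
    for use when f does not support forward-mode AD but does
    support torch.func.vjp."""
    out, pullback = torch.func.vjp(f, x)
    # pullback is linear: u -> J(x)^T u. Its VJP with cotangent v
    # (differentiating w.r.t. u) gives (J^T)^T v = J v.
    u0 = torch.zeros_like(out)
    _, pullback2 = torch.func.vjp(lambda u: pullback(u)[0], u0)
    (tangent_out,) = pullback2(v)
    return out, tangent_out
```

This costs two reverse passes instead of one forward pass, but it only requires double-backward support for the ops involved.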
Additional context
```
NotImplementedError                       Traceback (most recent call last)
in
      1 # trying jvp
----> 2 _, jvp_out = torch.func.jvp(affine_squared_differences, (T_A, T_b), (T_A_id, T_b_id))

~/opt/anaconda3/lib/python3.8/site-packages/torch/_functorch/eager_transforms.py in jvp(func, primals, tangents, strict, has_aux)
    914     """
    915
--> 916     return _jvp_with_argnums(func, primals, tangents, argnums=None, strict=strict, has_aux=has_aux)
    917
    918

~/opt/anaconda3/lib/python3.8/site-packages/torch/_functorch/vmap.py in fn(*args, **kwargs)
     37     def fn(*args, **kwargs):
     38         with torch.autograd.graph.disable_saved_tensors_hooks(message):
---> 39             return f(*args, **kwargs)
     40     return fn
     41

~/opt/anaconda3/lib/python3.8/site-packages/torch/_functorch/eager_transforms.py in _jvp_with_argnums(func, primals, tangents, argnums, strict, has_aux)
    963         primals = _wrap_all_tensors(primals, level)
    964         duals = _replace_args(primals, duals, argnums)
--> 965         result_duals = func(*duals)
    966         if has_aux:
    967             if not (isinstance(result_duals, tuple) and len(result_duals) == 2):

in affine_squared_differences(T_A, T_b)
     11     Returns a tensor of shape (1, C, Df, Hf, Wf) recording the squared differences
     12     """
---> 13     T_transformed = my_transform_image_pytorch(T_fix, T_mov, T_A, T_b)
     14     return(torch.square(T_fix - T_transformed))

in my_transform_image_pytorch(T_fix, T_mov, A, b, mode, padding_mode)
     67     a shape (1,Cm,Df,Hf,Wf) tensor
     68     """
---> 69     grid = F.affine_grid(torch.cat((A, b[:, None]), 1)[None, :], T_fix.size())
     70     return(F.grid_sample(T_mov, grid))
     71

~/opt/anaconda3/lib/python3.8/site-packages/torch/nn/functional.py in affine_grid(theta, size, align_corners)
   4339         raise ValueError("Expected non-zero, positive output size. Got {}".format(size))
   4340
-> 4341     return torch.affine_grid_generator(theta, size, align_corners)
   4342
   4343

NotImplementedError: Trying to use forward AD with affine_grid_generator that does not support it because it has not been implemented yet.
Please file an issue to PyTorch at https://github.com/pytorch/pytorch/issues/new?template=feature-request.yml so that we can prioritize its implementation.
Note that forward AD support for some operators require PyTorch to be built with TorchScript and for JIT to be enabled. If the environment var PYTORCH_JIT=0 is set or if the library is not built with TorchScript, some operators may no longer be used with forward AD.
```
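For reference, affine_grid is just a matmul between theta and a fixed base grid of normalized coordinates, so a hand-rolled version built from plain tensor ops (which do support forward AD) sidesteps the missing affine_grid_generator rule. Below is a 2-D sketch matching F.affine_grid for 4-D sizes; `manual_affine_grid_2d` is a made-up name, and the 3-D case used in my code (a (N, 3, 4) theta and (N, C, D, H, W) size) would be analogous:

```python
import torch
import torch.nn.functional as F

def manual_affine_grid_2d(theta, size, align_corners=False):
    """Rebuild F.affine_grid for 4-D sizes from arange/meshgrid/matmul,
    all of which support forward-mode AD.
    theta: (N, 2, 3) affine matrices; size: (N, C, H, W)."""
    N, C, H, W = size
    dtype, device = theta.dtype, theta.device
    if align_corners:
        # corner pixels sit exactly at -1 and 1
        xs = torch.linspace(-1, 1, W, dtype=dtype, device=device)
        ys = torch.linspace(-1, 1, H, dtype=dtype, device=device)
    else:
        # pixel-center convention: x_i = (2i + 1)/W - 1
        xs = (2 * torch.arange(W, dtype=dtype, device=device) + 1) / W - 1
        ys = (2 * torch.arange(H, dtype=dtype, device=device) + 1) / H - 1
    yy, xx = torch.meshgrid(ys, xs, indexing="ij")
    base = torch.stack([xx, yy, torch.ones_like(xx)], dim=-1)  # (H, W, 3)
    # (H*W, 3) @ (N, 3, 2) -> (N, H*W, 2): theta @ [x, y, 1]^T per point
    grid = base.reshape(-1, 3) @ theta.transpose(1, 2)
    return grid.reshape(N, H, W, 2)
```

With this in place, torch.func.jvp composes through the grid generation; whether the subsequent F.grid_sample supports forward AD depends on the PyTorch version, so the full pipeline may still need the operator-level fix this issue requests.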