
Conversation

@anjali411 anjali411 commented Aug 18, 2020

Stack from ghstack:

This PR adds gradcheck support for complex functions. The logic used for complex gradcheck is described in Section 3.5.3 of https://arxiv.org/pdf/1701.00392.pdf.

More concretely, this PR introduces the following changes:

  1. Updates `get_numerical_jacobian` to take a scalar value for the vector (v) as input. Adds gradcheck logic for C -> C, C -> R, and R -> C functions. For R -> C functions, only the real part of the gradient is propagated (a finite-difference sketch of this scheme follows the list).
  2. Adds a backward definition for `torch.complex`, along with a test to verify the definition added.
  3. Updates the backward formulas for `mul`, `sin`, `cos`, `sinh`, `cosh`.
  4. Adds tests for `torch.real`, `torch.imag`, `torch.view_as_real`, `torch.view_as_complex`, and `torch.conj`.
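
As an illustration of the scheme referenced in item 1, the sketch below shows how a numerical Wirtinger-style gradient can be formed from two central differences, one along the real axis and one along the imaginary axis. This is a minimal sketch, not the actual `get_numerical_jacobian` implementation; the names `numerical_wirtinger_grads`, `f`, `z`, and `eps` are illustrative.

```python
import torch

# Minimal sketch (not the actual get_numerical_jacobian implementation) of the
# finite-difference scheme from Section 3.5.3 of the paper linked above:
# perturb a complex input along the real and the imaginary axis, then combine
# the two directional derivatives into Wirtinger derivatives.
def numerical_wirtinger_grads(f, z, eps=1e-6):
    def central_difference(delta):
        # directional derivative of f at z along delta, where |delta| == eps
        return (f(z + delta) - f(z - delta)) / (2 * eps)

    df_dx = central_difference(eps)         # perturbation along the real axis
    df_dy = central_difference(eps * 1j)    # perturbation along the imaginary axis
    df_dz = 0.5 * (df_dx - 1j * df_dy)      # Wirtinger derivative df/dz
    df_dzconj = 0.5 * (df_dx + 1j * df_dy)  # conjugate Wirtinger derivative df/dz*
    return df_dz, df_dzconj
```

For an elementwise holomorphic function such as `torch.sin`, `numerical_wirtinger_grads(torch.sin, z)` should return a `df_dz` close to `torch.cos(z)` and a `df_dzconj` close to zero.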

Follow-up tasks:

- [ ] Add more thorough tests for R -> C cases. Specifically, add R -> C test variants for functions, e.g. `torch.mul(complex_tensor, real_tensor)` (#44744)
- [ ] Add back the commented-out test in `common_methods_invocation.py` (#43208)
- [ ] Add more special-case checking for complex gradcheck to make debugging easier
- [ ] Update the complex autograd note
- [ ] Disable complex autograd for operators not tested for complex
- [ ] Re-enable tests in `test_ops.py` for complex dtypes (#43208)
- [ ] Re-enable `TestGradCheckOverride.test_gradcheck` (cc @hameerabbasi)

Differential Revision: D23655088

[ghstack-poisoned]
@anjali411 changed the title from "Complex autograd logic" to "[WIP] Complex autograd logic" on Aug 18, 2020

dr-ci bot commented Aug 18, 2020

💊 CI failures summary and remediations

As of commit 87ed330 (more details on the Dr. CI page):


💚 💚 Looks good so far! There are no failures yet. 💚 💚


1 job timed out:

  • binary_linux_libtorch_3_7m_cpu_devtoolset7_shared-with-deps_test


@anjali411 changed the title from "[WIP] Complex autograd logic" to "[WIP] Complex gradcheck logic" on Aug 18, 2020
anjali411 added a commit that referenced this pull request Aug 18, 2020
ghstack-source-id: 6a2e586
Pull Request resolved: #43208
anjali411 added a commit that referenced this pull request Aug 19, 2020
ghstack-source-id: 10a83f3
Pull Request resolved: #43208
anjali411 added a commit that referenced this pull request Aug 25, 2020
ghstack-source-id: 62bde11
Pull Request resolved: #43208
anjali411 added a commit that referenced this pull request Aug 26, 2020
ghstack-source-id: 2b25ff1
Pull Request resolved: #43208
zou3519 added a commit that referenced this pull request Sep 15, 2020
To unblock #43208, which adds "is_complex" checks to backward formulas
that are being tested for batched gradient support with vmap.

Test Plan:
- `pytest test/test_vmap.py -v`

ghstack-source-id: f6dcd35
Pull Request resolved: #44649
zou3519 added a commit that referenced this pull request Sep 15, 2020
zou3519 added a commit that referenced this pull request Sep 16, 2020
zou3519 added a commit that referenced this pull request Sep 16, 2020
zou3519 added a commit that referenced this pull request Sep 16, 2020
ghstack-source-id: 596cb16
Pull Request resolved: #44649
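
For context on the "is_complex" checks these commits mention: the real checks live in PyTorch's C++ backward formulas, but the idea can be sketched at the Python level with a custom `torch.autograd.Function` whose backward branches on the input dtype. The sketch below is illustrative only (the class name `MyMul` is made up), using the conjugate-Wirtinger convention for multiplication.

```python
import torch

# Hedged Python-level sketch only: a custom autograd.Function whose backward
# branches on is_complex(), mirroring the kind of dtype check the commits above
# refer to. PyTorch's real mul backward lives in its C++ derivative formulas.
class MyMul(torch.autograd.Function):
    @staticmethod
    def forward(ctx, a, b):
        ctx.save_for_backward(a, b)
        return a * b

    @staticmethod
    def backward(ctx, grad_output):
        a, b = ctx.saved_tensors
        if a.is_complex() or b.is_complex():
            # conjugate-Wirtinger convention: the grad w.r.t. a pairs with
            # conj(b), and the grad w.r.t. b pairs with conj(a)
            return grad_output * b.conj(), grad_output * a.conj()
        # for real tensors conj() would be a no-op, so this branch is purely illustrative
        return grad_output * b, grad_output * a
```

Back-propagating through `MyMul.apply(a, b)` then takes the complex branch whenever either input is complex.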
anjali411 added a commit that referenced this pull request Sep 16, 2020
# or pure imaginary delta
def compute_gradient(delta):
    # we currently assume that the norm of delta equals eps
    assert(delta == eps or delta == (eps * 1j))
anjali411 (Contributor Author) commented on the diff above: cc. @ngimel
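
For context on how the new logic is exercised: after this PR, `torch.autograd.gradcheck` can be called directly on complex functions. The calls below are illustrative, not the PR's exact tests; double precision is used because gradcheck compares analytical gradients against finite differences.

```python
import torch
from torch.autograd import gradcheck

# Illustrative invocations (not the PR's exact tests): gradcheck on a C -> C
# function and on an R -> C function, relying on the backward definitions this
# PR adds or updates.
z = torch.randn(4, dtype=torch.cdouble, requires_grad=True)
assert gradcheck(torch.sin, (z,), eps=1e-6, atol=1e-4)          # C -> C

re = torch.randn(4, dtype=torch.double, requires_grad=True)
im = torch.randn(4, dtype=torch.double, requires_grad=True)
assert gradcheck(torch.complex, (re, im), eps=1e-6, atol=1e-4)  # R -> C
```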

facebook-github-bot pushed a commit that referenced this pull request Sep 16, 2020
Summary:
Pull Request resolved: #44649

To unblock #43208, which adds "is_complex" checks to backward formulas
that are being tested for batched gradient support with vmap.

Test Plan: - `pytest test/test_vmap.py -v`

Reviewed By: anjali411

Differential Revision: D23685356

Pulled By: zou3519

fbshipit-source-id: 29e41a9296336f6d1008e3040cade4c643bf5ebf
anjali411 added a commit that referenced this pull request Sep 16, 2020
xuzhao9 pushed a commit that referenced this pull request Sep 18, 2020
@facebook-github-bot

@anjali411 merged this pull request in 9f67176.

@anjali411

@hameerabbasi this PR has been merged, so would you like to create a follow-up PR to re-enable `TestGradCheckOverride.test_gradcheck`?

@hameerabbasi

I'll work on that soon, thanks.

loadbxh pushed a commit to loadbxh/Torch that referenced this pull request Sep 23, 2020
ghstack-source-id: fdc0e91
Pull Request resolved: pytorch/pytorch#43208
@facebook-github-bot facebook-github-bot deleted the gh/anjali411/52/head branch September 24, 2020 14:21
@anjali411

> I'll work on that soon, thanks.

Hi @hameerabbasi, just checking in to see if you are still planning to work on re-enabling `TestGradCheckOverride.test_gradcheck`.

@hameerabbasi

I apologize @anjali411, that fell off the radar due to a personal emergency. I'll add it to my to-do list so it's no longer just in my head.

@anjali411

> I apologize @anjali411, that fell off the radar due to a personal emergency. I'll add it to my to-do list so it's no longer just in my head.

No worries, and I hope all is well! Feel free to add me as a reviewer once you have a PR ready with your changes.


anjali411 commented Oct 9, 2020

@hameerabbasi gentle ping for re-enabling `TestGradCheckOverride.test_gradcheck`.

@hameerabbasi

There's a PR up that I submitted yesterday, #45732. Requested your review on it.

@anjali411

> There's a PR up that I submitted yesterday, #45732. Requested your review on it.

Thanks @hameerabbasi, it looks great. I totally missed your PR before, but it looks like @albanD is already reviewing it :D


Labels

Merged · module: complex (Related to complex number support in PyTorch)
