
Conversation

li-roy (Contributor) commented Sep 18, 2018

Summary: The first commit uses at::Half as a replacement for caffe2::float16. float16 conversions and arithmetic were removed in favor of their Half equivalents. As a side effect, aten_op now works for GPU half.
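
As a minimal sketch of what the swap buys (assuming only that the ATen headers are on the include path; `scale` is an illustrative function, not code from this PR): at::Half converts implicitly to and from float, so arithmetic that previously went through explicit float16 conversion helpers becomes a plain expression.

```cpp
#include <ATen/ATen.h>  // provides at::Half

// x promotes to float, the multiply happens in float, and the result
// converts back to Half on return -- no float16<->float helpers needed.
at::Half scale(at::Half x, float factor) {
  return x * factor;
}
```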

The second commit is a codemod (caffe2::float16 -> at::Half). Notable changes: in caffe2/perfkernels/embedding_lookup.cc, the macro was changed because at::Half can't appear in a generated function name; the same change was made to the codegen in hp_emblookup_codegen.py. A sketch of the issue follows below.
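
To make the macro issue concrete, here is a hypothetical before/after sketch; the macro and function names are illustrative stand-ins, not the exact code in embedding_lookup.cc:

```cpp
#include <ATen/ATen.h>  // provides at::Half

// Before: the type itself is token-pasted into the function name.
// EMBEDDING_SPECIALIZATION(float) declares EmbeddingLookup_float, but
// EMBEDDING_SPECIALIZATION(at::Half) would declare
// "EmbeddingLookup_at::Half", which is not a valid identifier.
#define EMBEDDING_SPECIALIZATION(T) \
  void EmbeddingLookup_##T(const T* input, int n);
EMBEDDING_SPECIALIZATION(float)
#undef EMBEDDING_SPECIALIZATION

// After: the spelled-out name and the type are passed separately, so any
// type expression works.
#define EMBEDDING_SPECIALIZATION(NAME, T) \
  void EmbeddingLookup_##NAME(const T* input, int n);
EMBEDDING_SPECIALIZATION(half, at::Half)  // declares EmbeddingLookup_half
```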

Differential Revision: D9892158

ezyang (Contributor) commented Sep 18, 2018

Errr, this wasn't just a simple codemod, was it? Can we have a billing of changes?

[Several comments were marked as off-topic and hidden.]

ezyang (Contributor) left a review comment

I'm mostly a little concerned about Python side BC-breaking changes.

li-roy (Contributor, Author) commented Sep 18, 2018

Yeah, I'll put out a list of changes tomorrow; I just wanted to start running tests tonight.

@li-roy li-roy changed the title codemod: caffe2::float16 -> at::Half Replace float16 with at::Half in caffe2 Sep 18, 2018

[Additional comments were marked as off-topic and hidden.]

li-roy (Contributor, Author) commented Sep 18, 2018

@pytorchbot retest this please.

Roy Li and others added 2 commits September 20, 2018 14:02
Summary:
- Finishes unifying Half type in pytorch and caffe2
- As a side effect, aten_op works for fp16 now
Pull Request resolved: pytorch#11676

Differential Revision: D9829019

fbshipit-source-id: e5f800024478c2e68ef29f4c06b4f0002f81a3f7
Summary:
Pull Request resolved: pytorch#11785

Replace each instance of float16 with Half.

Reviewed By: Yangqing

Differential Revision: D9892158

fbshipit-source-id: c1f8dd9233423786b4a129569d4df2d39babd58e
iotamudelta pushed a commit to ROCm/pytorch that referenced this pull request Sep 21, 2018
Summary:
Pull Request resolved: pytorch#11785

Replace each instance of float16 with Half.

Reviewed By: Yangqing

Differential Revision: D9892158

fbshipit-source-id: b9225ca7bd5c84fd1c04a9d24b026c8b6cbff120
@ezyang ezyang added the merged label Jun 26, 2019
