Add float8 support in serde schema #143343
Conversation
🔗 Helpful links: see artifacts and rendered test results at hud.pytorch.org/pr/143343.
Note: links to docs will display an error until the docs builds have completed.
❌ 2 new failures as of commit 8b140d0 with merge base a3688ea. The following jobs have failed:
This comment was automatically generated by Dr. CI and updates every 15 minutes.
This pull request was exported from Phabricator. Differential Revision: D67307670
yiming0416 left a comment:
You'll need to run scripts/export/update_schema.py to update the auto-generated YAML, C++, and Thrift files as well.
Let's add a comment in schema.py to share this knowledge.
Summary:
Fixes pytorch#141316. Adds float8 support in the serde schema and bumps the schema minor version.

There is also a minor change to the schema check line:

```
if kind == "struct" and "default" not in d:
```

The order of the two conditions is swapped because when `kind == "enum"`, `d` can be a plain int, so evaluating `"default" not in d` first would raise `TypeError: argument of type 'int' is not iterable`.

Test Plan:

```
buck2 run 'fbcode//mode/dev-nosan' fbcode//caffe2/test:test_export -- -r test_serialize_float8
buck2 run fbcode//caffe2:export_update_schema -- --prefix /data/users/shangdiy/fbsource/fbcode/caffe2/
```

Differential Revision: D67307670
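The condition-ordering fix above can be sketched in isolation. This is a minimal, hypothetical reduction (the names `needs_default`, `struct_entry`, and `enum_entry` are illustrative, not the real schema.py structures); it shows why putting the `kind == "struct"` check first lets `and` short-circuit before the membership test ever touches a non-iterable value:

```python
def needs_default(kind, d):
    # Safe ordering: the membership test only runs when kind is "struct",
    # so d is only treated as a container in that branch.
    return kind == "struct" and "default" not in d

struct_entry = {"type": "Tensor"}  # a struct field parsed as a dict, no default
enum_entry = 3                     # an enum value may parse to a bare int

print(needs_default("struct", struct_entry))  # True: struct dict lacks "default"
print(needs_default("enum", enum_entry))      # False: short-circuits, no TypeError

# With the conditions reversed, the enum case would crash:
# "default" not in 3  ->  TypeError: argument of type 'int' is not iterable
```

Reversing the operands of `and` would evaluate `"default" not in d` on the int first and raise exactly the `TypeError` described in the summary.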
Force-pushed from 808dac4 to 8b140d0 (compare).
@yiming0416 The schema is updated now!
@pytorchbot merge (initiating merge automatically since the Phabricator diff has merged)
Merge started. Your change will be merged once all checks pass (ETA 0-4 hours). Learn more about merging in the wiki. Questions? Feedback? Please reach out to the PyTorch DevX Team.
Merge failed. Reason: 2 jobs have failed; the first few are: inductor-rocm / rocm6.2-py3.10-inductor / test (inductor, 1, 2, linux.rocm.gpu.2) and inductor-rocm / rocm6.2-py3.10-inductor / test (inductor, 2, 2, linux.rocm.gpu.2). Details for the Dev Infra team: raised by a workflow job.
@pytorchbot merge -i
Merge started. Your change will be merged while ignoring the following 2 checks: inductor-rocm / rocm6.2-py3.10-inductor / test (inductor, 1, 2, linux.rocm.gpu.2) and inductor-rocm / rocm6.2-py3.10-inductor / test (inductor, 2, 2, linux.rocm.gpu.2). Learn more about merging in the wiki. Questions? Feedback? Please reach out to the PyTorch DevX Team.