Conversation

@ezyang
Contributor

@ezyang ezyang commented Nov 24, 2024

[ghstack-poisoned]
@pytorch-bot

pytorch-bot bot commented Nov 24, 2024

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/141444

Note: Links to docs will display an error until the docs builds have been completed.

✅ You can merge normally! (1 Unrelated Failure)

As of commit 36404fb with merge base dd2d0c6 (image):

UNSTABLE - The following job failed but was likely due to flakiness present on trunk and has been marked as unstable:

This comment was automatically generated by Dr. CI and updates every 15 minutes.

[ghstack-poisoned]
ezyang added a commit that referenced this pull request Nov 24, 2024
Fixes #137100

Should also add a mark_oblivious API for manual control.

Signed-off-by: Edward Z. Yang <[email protected]>

ghstack-source-id: 2e81967
Pull Request resolved: #141444
# Treat the dimension statically based on its hint
STATIC = 2
# Treat the dimension as a size-like unbacked
SIZE_LIKE_UNBACKED = 3
Contributor

I find this naming super confusing. Let me get this straight:

  1. torch._dynamo.mark_unbacked sets the dimension to DimDynamic.SIZE_LIKE_UNBACKED via https://www.internalfb.com/code/fbsource/[7157ca20df68]/fbcode/caffe2/torch/_dynamo/variables/builder.py?lines=2738
  2. In #137100 ("Something like mark_unbacked but only does size oblivious"), you write this will "opt into the alternate size oblivious semantics, which allows you to avoid doing specializations on 0/1 in many situations"

So it's really not SIZE_LIKE_UNBACKED, but really more like SIZE_OBLIVIOUS_UNBACKED no?

Contributor

Uhm, I think I might be confusing myself. It seems like size_oblivious and size_like are indeed analogous concepts.

size_oblivious seems to describe a type of analysis where we assume all size-like symbols are >= 2.

Size-like is just an annotation that we put onto symbols; size_oblivious uses it to narrow down which symbols we can do size-oblivious analysis with.

So I guess now my confusion is why we have an OBLIVIOUS_SIZE. Why not call it SIZE_LIKE_BACKED?
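To make the annotation-vs-analysis split above concrete, here is a minimal pure-Python sketch (all names invented for illustration; the real machinery lives in torch.fx.experimental.symbolic_shapes): size-oblivious analysis assumes every size-like symbol is at least 2, which is what lets checks like `size != 1` or `size >= 0` pass without specializing on the 0/1 cases.

```python
# Hypothetical sketch of size-oblivious reasoning; not the real torch
# internals.  "size_like" is an annotation on a symbol; the *analysis*
# (size-oblivious evaluation) consults that annotation.

class SymInt:
    def __init__(self, name, hint=None, size_like=False):
        self.name = name          # symbol name, e.g. "u0" or "s0"
        self.hint = hint          # concrete example value, if backed
        self.size_like = size_like

def size_oblivious_value(sym):
    # For size-like symbols, reason as if the value were at least 2,
    # ruling out the 0/1 special cases without guarding on them.
    if sym.size_like:
        return max(sym.hint if sym.hint is not None else 2, 2)
    return sym.hint

u0 = SymInt("u0", hint=None, size_like=True)   # unbacked, size-like
assert size_oblivious_value(u0) >= 2           # 0/1 cases are ruled out
assert size_oblivious_value(u0) != 1
```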

Contributor

Ah... so I guess what separates oblivious and size-like is the counter-factual reasoning you do. SIZE_LIKE_UNBACKED does size-oblivious reasoning BUT will raise when we try to guard. OBLIVIOUS_SIZE also does size-oblivious reasoning BUT will raise only if the counter-factual reasoning of non-size-like vs size-like differs.
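That guard-time difference can be sketched as follows (hypothetical names and structure, assuming the description in the comment above is accurate; the real logic lives in torch's symbolic shapes code): SIZE_LIKE_UNBACKED has no hint at all and so must refuse any guard, while OBLIVIOUS_SIZE has a hint and raises only when the hinted answer disagrees with the size-oblivious (>= 2) counterfactual.

```python
# Hypothetical sketch of the two guard behaviors described above
# (names invented; not the real torch internals).

class GuardOnUnbacked(Exception):
    pass

def guard_size_like_unbacked(predicate):
    # SIZE_LIKE_UNBACKED: there is no hint at all, so any attempt to
    # guard on the symbol must raise.
    raise GuardOnUnbacked("cannot guard on an unbacked symbol")

def guard_oblivious_size(predicate, hint):
    # OBLIVIOUS_SIZE: we do have a real hint, so evaluate the predicate
    # twice -- once with the actual hint, once under the size-oblivious
    # assumption (value >= 2) -- and raise only if the answers differ.
    actual = predicate(hint)
    oblivious = predicate(max(hint, 2))
    if actual != oblivious:
        raise GuardOnUnbacked("size-oblivious counterfactual disagrees")
    return actual

# A 0/1-insensitive guard succeeds; a guard that hinges on the hint
# being 0 or 1 raises instead of silently specializing.
assert guard_oblivious_size(lambda n: n >= 0, hint=5)
```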

Contributor

I wonder if it makes more sense to do something like OBLIVIOUS_SIZE_LIKE_UNBACKED so future readers don't need to wander through this idea maze.

Contributor Author

SIZE_LIKE_BACKED might be OK, but it could be confusing because we do in fact allocate unbacked symbols for OBLIVIOUS_SIZE (they just get special handling via the oblivious_var_to_val map). OBLIVIOUS_SIZE_LIKE_UNBACKED could make what is currently called SIZE_LIKE_UNBACKED clearer, although for symmetry I would probably call it OBLIVIOUS_UNBACKED_SIZE if we weren't renaming. This is all private API, so it's easy to change; we just need to decide what is clearest.

# Infer the strides from stride. If size is static, strides will be static as well.
INFER_STRIDE = 4
# Like SIZE_LIKE_UNBACKED, but there's a hint
OBLIVIOUS_SIZE = 5
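For context, the excerpts above show only entries 2-5 of the enum under review. A sketch of the full enum, with the lower entries reconstructed from the surrounding PyTorch source (details may differ in the exact version of this PR):

```python
from enum import Enum

# Sketch of the DimDynamic enum being discussed; the DYNAMIC/DUCK
# entries and all comments are reconstructions, not verbatim source.
class DimDynamic(Enum):
    DYNAMIC = 0             # treat the dimension symbolically
    DUCK = 1                # symbolic, but duck-sized (symbols reused across equal hints)
    STATIC = 2              # treat the dimension statically based on its hint
    SIZE_LIKE_UNBACKED = 3  # size-like unbacked symbol: no hint, guards raise
    INFER_STRIDE = 4        # infer strides from size; static size gives static strides
    OBLIVIOUS_SIZE = 5      # like SIZE_LIKE_UNBACKED, but there's a hint
```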
Contributor

[if what I wrote above is false, then read on, otherwise ignore]

Let me make sure I understand:

SIZE_LIKE_UNBACKED: Expands into size-like, unbacked symbolic integer. This means we do specialize on 0/1 and we don't allow guards.

OBLIVIOUS_SIZE: Expands into size-oblivious, backed symbolic integer. This means we don't specialize on 0/1 and we do allow guards.

Is this correct? If so, OBLIVIOUS_SIZE really isn't like SIZE_LIKE_UNBACKED right? If anything it's the opposite? If this is the case, maybe it'd be more clear to rename to OBLIVIOUS_SIZE_BACKED

Contributor Author

(see comment above)

@ezyang ezyang changed the title [POC] Add automatic_dynamic_shapes_mark_as == "oblivious" Add automatic_dynamic_shapes_mark_as == "oblivious" Dec 11, 2024
[ghstack-poisoned]
ezyang added a commit that referenced this pull request Dec 11, 2024
Fixes #137100

Should also add a mark_oblivious API for manual control.

Signed-off-by: Edward Z. Yang <[email protected]>

ghstack-source-id: e45a1c8
Pull Request resolved: #141444
@ezyang
Contributor Author

ezyang commented Dec 11, 2024

@pytorchbot merge

@pytorch-bot pytorch-bot bot added the ciflow/trunk Trigger trunk jobs on your pull request label Dec 11, 2024
@pytorchmergebot
Collaborator

Merge started

Your change will be merged once all checks pass (ETA 0-4 Hours).

Learn more about merging in the wiki.

Questions? Feedback? Please reach out to the PyTorch DevX Team

Advanced Debugging: check the merge workflow status.

@pytorchmergebot
Collaborator

The merge job was canceled or timed out. This most often happens if two merge requests were issued for the same PR, or if the merge job was waiting for more than 6 hours for tests to finish. In the latter case, please do not hesitate to reissue the merge command.
For more information see pytorch-bot wiki.

@ezyang
Contributor Author

ezyang commented Dec 11, 2024

@pytorchbot merge -f "looks fine"

@pytorchmergebot
Collaborator

Merge started

Your change will be merged immediately since you used the force (-f) flag, bypassing any CI checks (ETA: 1-5 minutes). Please use -f as a last resort and instead consider -i/--ignore-current to continue the merge while ignoring current failures. This will allow currently pending tests to finish and report signal before the merge.

Learn more about merging in the wiki.

Questions? Feedback? Please reach out to the PyTorch DevX Team

Advanced Debugging: check the merge workflow status.

@github-actions github-actions bot deleted the gh/ezyang/3011/head branch January 11, 2025 02:11

Labels

ciflow/inductor, ciflow/trunk, fx, Merged, module: dynamo, release notes: fx
