Conversation

@angelayi (Contributor) commented Jul 24, 2024

In cases where the program takes a constant as input, export specializes on the constant and embeds it into the graph, leaving the graph with a placeholder node that has no users. However, Inductor then errors further down, because in the typical torch.compile flow these constants do not show up as inputs. Since the constants are already embedded in the graph, we simply ignore these inputs when compiling with AOTI, and filter out the non-tensor inputs at runtime.
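A minimal sketch of the situation, assuming a toy module (hypothetical, for illustration): export specializes on the non-tensor input and bakes its value into the graph, leaving behind a placeholder with no users.

import torch

class M(torch.nn.Module):
    def forward(self, x, scale):
        # `scale` arrives as a plain Python float, not a Tensor
        return x * scale

ep = torch.export.export(M(), (torch.randn(3), 2.0))
# The graph multiplies by the embedded constant 2.0; the `scale`
# placeholder remains in the graph but has no users.
print(ep.graph)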

cc @voznesenskym @penguinwu @EikanWang @jgong5 @Guobing-Chen @XiaobingSuper @zhuhaozhe @blzheng @wenzhe-nrv @jiayisunx @ipiszy @yf225 @chenyang78 @kadeng @muchulee8 @ColinPeppler @amjames @desertfire @chauhang

@pytorch-bot bot commented Jul 24, 2024

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/131594

Note: Links to docs will display an error until the docs builds have been completed.

✅ No Failures

As of commit 16f315b with merge base 980bb54:
💚 Looks good so far! There are no failures yet. 💚

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@facebook-github-bot (Contributor) commented:

@angelayi has imported this pull request. If you are a Meta employee, you can view this diff on Phabricator.

@angelayi angelayi force-pushed the angelayi/skip_consts branch from 5a9b27a to bab80fa Compare July 24, 2024 07:03
@facebook-github-bot (Contributor) commented:

@angelayi has imported this pull request. If you are a Meta employee, you can view this diff on Phabricator.

@angelayi angelayi force-pushed the angelayi/skip_consts branch from bab80fa to 22cf510 Compare July 24, 2024 16:02
in_spec = pytree.treespec_loads(call_spec[0])
out_spec = pytree.treespec_loads(call_spec[1])
flat_inputs = pytree.tree_flatten((args, reorder_kwargs(kwargs, in_spec)))[0]
# Drop non-tensor inputs; the exported graph has already specialized on them.
flat_inputs = [x for x in flat_inputs if isinstance(x, torch.Tensor)]
A reviewer (Contributor) commented on the hunk above:

What about the C++ side? Is there a way to extend the Runner interface?

@angelayi (Contributor Author) replied:

Hmm, right now it results in a type error if you pass non-tensors to the runner, because the C++ interface takes a List[Tensor]. Does this mean we should change it to a union of input types?
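For intuition, here is a small sketch of the Python-side filtering shown in the hunk above, using torch.utils._pytree directly (the inputs are illustrative, not from the PR):

import torch
import torch.utils._pytree as pytree

args = (torch.randn(2), 3.0)
kwargs = {"flag": True, "y": torch.ones(2)}
# Flatten the mixed inputs, then keep only the Tensors; the constants
# (3.0 and True) are already embedded in the compiled graph.
flat_inputs = pytree.tree_flatten((args, kwargs))[0]
tensor_inputs = [x for x in flat_inputs if isinstance(x, torch.Tensor)]
print(len(flat_inputs), len(tensor_inputs))  # 4 2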

@angelayi angelayi force-pushed the angelayi/skip_consts branch from 22cf510 to 16f315b Compare July 24, 2024 22:41
@facebook-github-bot (Contributor) commented:

@angelayi has imported this pull request. If you are a Meta employee, you can view this diff on Phabricator.

@facebook-github-bot (Contributor) commented:

@pytorchbot merge -f 'Landed internally'

(Initiating merge automatically since Phabricator Diff has merged, using force because this PR might not pass merge_rules.json but landed internally)

@pytorchmergebot (Collaborator) commented:

Merge started

Your change will be merged immediately since you used the force (-f) flag, bypassing any CI checks (ETA: 1-5 minutes). Please use -f only as a last resort; instead, consider -i/--ignore-current to continue the merge while ignoring current failures. This allows currently pending tests to finish and report signal before the merge.

Learn more about merging in the wiki.

Questions? Feedback? Please reach out to the PyTorch DevX Team

Advanced Debugging: check the merge workflow status here.

@github-actions github-actions bot deleted the angelayi/skip_consts branch August 25, 2024 02:03
coconutruben added a commit to coconutruben/pytorch that referenced this pull request Sep 30, 2024
Summary:
# Why

These arguments are filtered out because they are constants in the compiled graph, but the debug logger still expects a non-None value.

# What

When a filtered-out argument (None) is passed to the debug logger, log that it is a filtered-out argument instead of raising a TypeError.

# Background

pytorch#131594

Test Plan: - execute repro from pytorch#135584 (comment) with and without the edits

Differential Revision: D63652564
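A hedged sketch of the logger-side fix described above (the helper name describe_input is hypothetical, not the actual function in the commit): report a filtered-out argument instead of assuming every input is a Tensor.

import torch

def describe_input(i, arg):
    # None marks an input that was filtered out because export embedded
    # it into the graph as a constant.
    if arg is None:
        return f"input[{i}]: filtered out (constant embedded in graph)"
    if isinstance(arg, torch.Tensor):
        return f"input[{i}]: Tensor(shape={tuple(arg.shape)}, dtype={arg.dtype})"
    return f"input[{i}]: {type(arg).__name__}"

This mirrors the behavior change: previously the logger assumed a non-None value and raised a TypeError on filtered inputs.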
coconutruben pushed updated versions of this commit to coconutruben/pytorch referencing this pull request on Sep 30 and Oct 1, 2024 (pytorch#137032), with the same summary as above; the later revisions also add a unit test to capture this behavior going forward. Reviewed By: angelayi. Differential Revision: D63652564.
pytorchmergebot pushed a commit that referenced this pull request Oct 2, 2024, with the same summary as above (Differential Revision: D63652564).

Pull Request resolved: #137032
Approved by: https://github.com/angelayi
AnantGulati pushed a commit to AnantGulati/pytorch that referenced this pull request Oct 2, 2024, with the same commit message as above.