Conversation

eellison (Contributor) commented Jun 4, 2019

Stack from ghstack:

This PR lets us constant-prop through ops that forward tuples, and through builtins that return tuples. Previously, tuples inhibited constant propagation, which, combined with the lower-tuples pass, caused jitter when running constant propagation on compilation.
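
For illustration only (not taken from the PR; the function name and values are made up), this is the kind of TorchScript code the change lets constant propagation fold, since the tuple is built from constants and only forwarded through indexing:

import torch

@torch.jit.script
def fold_me():
    t = (1, 2, 3)        # tuple built from constants
    return t[0] + t[2]   # forwarded through indexing; foldable to the constant 4

# The IR can be inspected with fold_me.graph; whether the fold is already
# visible there depends on which passes have run in your PyTorch build.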

pytorchbot added the oncall: jit (Add this issue/PR to JIT oncall triage queue) and module: cpp (Related to C++ API) labels on Jun 4, 2019
eellison requested review from driazati, jamesr66a, and suo on June 4, 2019 21:40
ezyang added and removed the facebook label on Jun 5, 2019
suo (Member) commented Jun 5, 2019

What is stopping us from representing tuples as constants? The code would be a lot cleaner if we could do that, and just apply the same rules we have for everything else to tuples.

eellison (Contributor, author) commented Jun 5, 2019

I think it would lead to a more complicated IR in exchange for a simpler constant propagation pass, which isn't necessarily a good trade-off. It could also lead to worse memory usage if you're not careful with something like:

a = torch.zeros(10000)
b = (a, a)
c = (a, a, a)
d = (a, a, a, a)

Previously, all uses of a would get pooled into a single constant, but with constants that can contain other constants the pooling logic becomes much more difficult. It would also complicate the IR parser.
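
As a rough Python-level sketch of the pooling concern (illustrative only, not the JIT pass itself): with pooling, every use of a above refers to the same underlying storage, whereas tuple constants that own their elements would behave like independent copies.

import torch

a = torch.zeros(10000)
b = (a, a)
d = (a, a, a, a)

# pooled: every element shares the single 10000-element storage
assert b[0].data_ptr() == d[3].data_ptr()

# hypothetical unpooled layout: each tuple holds its own copy of the data
b2 = (a.clone(), a.clone())
assert b2[0].data_ptr() != b2[1].data_ptr()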

if (inp) {
  inputs.push_back(*inp);
} else {
  inserted_tuple = false;
A reviewer (Contributor) commented:
Couldn't you return nullptr here and get rid of this boolean? Why do you need to keep trying to constify elements if any one of them fails?

eellison (author) replied:

True

(*new_output)->setType(n->outputs()[i]->type());
}
n->outputs()[i]->replaceAllUsesWith(*new_output);
} else if (outputs[i].isTuple() && n->outputs().size() == 1) {
A reviewer (Contributor) commented:
Why does the tuple have to be the only output?

eellison (author) replied:

In general it's a bug for an op not to have exactly one output; the output should be either None, T, or a Tuple, so this branch should never be hit. But I included the check because it would be weird if one of an op's outputs were constantified while we still kept the op in the graph and ran it again.

if (tuple_ops.count(n->kind())) {
  return (
      std::all_of(n->inputs().begin(), n->inputs().end(), [&](Value* v) {
        return v->node()->kind() == prim::Constant || tuples.count(v);
A reviewer (Contributor) commented:
Won't tuples.count(v) return true for any tuple, since it contains all of them regardless of whether they are constant or not?

eellison (author) replied:
No, it will only return true for tuples for which we've computed the IValue.
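
A minimal Python analogue of that check (hypothetical names and values; the actual pass is C++): a tuple-forwarding node is only considered runnable once every input is either a known constant or a tuple whose value has already been computed and recorded.

known_constants = {"x", "y"}             # values produced by prim::Constant
computed_tuples = {"t1": (1, 2)}         # tuples whose IValue has been computed

def can_run(node_inputs):
    return all(v in known_constants or v in computed_tuples for v in node_inputs)

print(can_run(["x", "t1"]))   # True
print(can_run(["x", "t2"]))   # False: t2's value was never computed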

eellison added 2 commits June 6, 2019 15:44
gh-metadata: pytorch pytorch 21500 gh/eellison/5/head
gh-metadata: pytorch pytorch 21501 gh/eellison/6/head
eellison force-pushed the gh/eellison/2/head branch from 7b1806c to 3ce9a9f on June 6, 2019 22:45
eellison (Contributor, author) commented Jun 7, 2019

Closing in favor of #21501 since I did something to mess up ghstack here.

eellison closed this on Jun 7, 2019
facebook-github-bot deleted the gh/eellison/2/head branch on October 28, 2019