Conversation

@mikaylagawarecki (Contributor) commented Feb 18, 2025

EDIT: This is not a comprehensive fix because of `legacy_load`; will redo.

The provided exploit now errors during `torch.load` with:

RuntimeError: size is inconsistent with indices: for dim 0, size is 1 but found index 4702111234474983745
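
For context, a minimal sketch of how the same invariant check can be triggered directly. This is an illustration, not the exploit file: it assumes a recent PyTorch where the public `torch.sparse_coo_tensor` constructor accepts `check_invariants=True`, rather than the legacy `torch.sparse.FloatTensor` path this PR patches.

```python
import torch

# Hypothetical repro sketch: an index that exceeds the declared size for dim 0
# trips the same sparse-invariant check whose message is quoted above.
indices = torch.tensor([[4702111234474983745], [0]])  # out-of-range index for dim 0
values = torch.tensor([1.0])

try:
    torch.sparse_coo_tensor(indices, values, (1, 1), check_invariants=True)
except RuntimeError as e:
    print(e)  # "size is inconsistent with indices: for dim 0, size is 1 but found index ..."
```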

Stack from ghstack (oldest at bottom):


pytorch-bot bot commented Feb 18, 2025

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/147408

Note: Links to docs will display an error until the docs builds have been completed.

⏳ No Failures, 2 Pending

As of commit a666d8b with merge base b10ba0a:
💚 Looks good so far! There are no failures yet. 💚

This comment was automatically generated by Dr. CI and updates every 15 minutes.

mikaylagawarecki added a commit that referenced this pull request Feb 18, 2025
provided exploit now errors with 

RuntimeError: Storage size calculation overflowed with sizes=[2] and strides=[1]
during torch.load

[ghstack-poisoned]
provided exploit now errors with 

RuntimeError: size is inconsistent with indices: for dim 0, size is 1 but found index 4702111234474983745
during torch.load

[ghstack-poisoned]
mikaylagawarecki added a commit that referenced this pull request Feb 20, 2025

@albanD (Collaborator) left a comment

Thanks!

"size is inconsistent with indices: for dim 0, size is 1 but found index 4702111234474983745"
):
x = torch.sparse.FloatTensor(
torch.tensor([[4702111234474983745], [0]]),

nit: you can use 3 as the index here to avoid the weirdly large value. It should fail the same way!
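
For illustration, a sketch of what the suggested assertion could look like with the smaller index. Assumptions: a plain `unittest.TestCase` and the public `torch.sparse_coo_tensor` constructor with `check_invariants=True`; the actual test excerpted above goes through the legacy `torch.sparse.FloatTensor` constructor and may differ.

```python
import unittest

import torch


class SparseIndexValidationSketch(unittest.TestCase):
    # Hypothetical stand-in for the test in the excerpt, using index 3 as suggested.
    def test_out_of_range_index_rejected(self):
        with self.assertRaisesRegex(
            RuntimeError,
            "size is inconsistent with indices: for dim 0, size is 1 but found index 3",
        ):
            torch.sparse_coo_tensor(
                torch.tensor([[3], [0]]),  # index 3 exceeds the declared size of 1 in dim 0
                torch.tensor([1.0]),
                (1, 1),
                check_invariants=True,
            )


if __name__ == "__main__":
    unittest.main()
```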

@mikaylagawarecki (Contributor, Author) commented

We actually can't do this, due to the reason I explained regarding `legacy_load`. Closing this PR and will reopen another one.

mikaylagawarecki added a commit that referenced this pull request Feb 24, 2025
…constructor to _sparse_tensors_to_validate"

This is a redo of #147408, which added validation at the end of the legacy constructor calls.

The reason I didn't land that PR is that in `legacy_load`, the constructor is called before the storages of the indices/values are set, so the tensor would not actually be validated.

Technically, `torch.sparse.{Foo}Tensor` should not even be called by our rebuild process: as far as I can tell, the first PR that added support for sparse tensor serialization, #27062, already uses `_rebuild_sparse_tensor` (which adds the rebuilt tensor to the list to validate). However, `torch.sparse.FooTensor` is allowlisted.

This PR adds tensors constructed that way to the list that is validated at the end of `torch.load`.

[ghstack-poisoned]
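
To make the deferred-validation idea concrete, here is a simplified sketch of the mechanism described above. It is an illustration only: the names `_sparse_tensors_to_validate` and `_validate_loaded_sparse_tensors` mirror the helpers this PR touches, but the real code lives in `torch/_utils.py` and `torch/serialization.py` and differs in detail.

```python
import torch

# Sparse tensors whose invariants could not be checked at construction time,
# e.g. under legacy_load where the indices/values storages are attached later.
_sparse_tensors_to_validate = []


def _register_for_validation(t):
    # Called from the legacy torch.sparse.FooTensor constructor path during
    # unpickling: record the tensor instead of validating it immediately.
    _sparse_tensors_to_validate.append(t)
    return t


def _validate_loaded_sparse_tensors():
    # Called once at the very end of torch.load, after all storages are set,
    # so the indices/values actually contain the deserialized data.
    try:
        for t in _sparse_tensors_to_validate:
            # Raises the "size is inconsistent with indices" RuntimeError
            # quoted in the PR description when the indices are out of range.
            torch._validate_sparse_coo_tensor_args(t._indices(), t._values(), t.size())
    finally:
        _sparse_tensors_to_validate.clear()
```

Running the validation only after all storages are populated is what makes the error surface during `torch.load` rather than at constructor time, which is exactly the gap `legacy_load` exposed in the first attempt.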
pytorchmergebot pushed a commit that referenced this pull request Feb 25, 2025
…ors_to_validate (#147759)


Pull Request resolved: #147759
Approved by: https://github.com/albanD
aditew01 pushed a commit that referenced this pull request Feb 28, 2025
…ors_to_validate (#147759)

Pull Request resolved: #147759
Approved by: https://github.com/albanD
github-actions bot deleted the gh/mikaylagawarecki/318/head branch on March 27, 2025 02:10